- fork this repo and clone it locally!
- navigate into the folder with the above files
- type `runMultiClassLogisticRegressionNeuralNetwork` in the Octave or Matlab command line to see an example of a trained 2 layer neural network that recognizes handwritten digits with a 95% success rate.
- type `runMultiClassNeuralNetworkWith3Layers` in the Octave or Matlab command line to see an example of a trained 3 layer neural network that recognizes handwritten digits with a 97% success rate.
+ For problems like computer vision, the number of features is very large. For example, a 50 x 50 pixel greyscale image has 2500 pixels. And if we trained linear or logistic regression with all pairwise pixel relationships, we would have roughly 2500 x 2500 / 2 quadratic features, i.e. 3 million plus, which is WAY TOO MUCH.
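As a quick sanity check on that count, here is the arithmetic as a small Octave/Matlab sketch (not part of the repo's scripts):

```matlab
n = 50 * 50;                  % 2500 raw pixel features
quadratic = n * (n + 1) / 2;  % all pairwise (quadratic) pixel products
fprintf('raw: %d, quadratic: %d\n', n, quadratic);  % ~3.1 million
```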
- Representing a Neuron
- A single artificial neuron with a sigmoid activation computes h_theta(x) = g(theta' * x) = 1 / (1 + e^(-theta' * x)), where x = [x_0; x_1; x_2; x_3] with x_0 always = 1, and theta = [theta_0; theta_1; theta_2; theta_3]
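A minimal Octave/Matlab sketch of this single neuron (the input and weight values below are made up purely for illustration):

```matlab
g = @(z) 1 ./ (1 + exp(-z));   % sigmoid activation

x     = [1; 0.5; -1.2; 0.3];   % x_0 = 1 is the bias unit
theta = [0.1; 2.0; -0.5; 0.7]; % weights theta_0 .. theta_3
h     = g(theta' * x)          % neuron output h_theta(x)
```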
- Now if we have a whole bunch of artificial neurons arranged in a few layers, we get a neural network like:
- where (a_i)^(j) = the activation of unit i in layer j, and BigTheta^(j) = the matrix of weights controlling the function mapping from layer j to layer j + 1
- If a neural network has s_j units in layer j and s_(j+1) units in layer j + 1, then the dimension of BigTheta^(j) is s_(j+1) x (s_j + 1).
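For example, with s_j = 2 units in layer j and s_(j+1) = 4 units in layer j + 1 (made-up sizes), BigTheta^(j) is 4 x 3:

```matlab
s_j  = 2;                      % units in layer j
s_j1 = 4;                      % units in layer j + 1
Theta = zeros(s_j1, s_j + 1);  % the extra column multiplies the bias unit
size(Theta)                    % ans = 4   3
```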
- This is the set of equations that describes the above neural network configuration during forward propagation:
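Those equations reduce to a few vectorized lines. Here is a minimal Octave/Matlab sketch of one forward pass, assuming a 3 layer network with 3 inputs, 3 hidden units, and 1 output, where Theta1 and Theta2 stand for BigTheta^(1) and BigTheta^(2) (the random weights and example input are placeholders):

```matlab
g = @(z) 1 ./ (1 + exp(-z));  % sigmoid activation

x      = [0.5; -1.2; 0.3];    % example input (made up)
Theta1 = randn(3, 4);         % BigTheta^(1): 3 hidden units x (3 inputs + bias)
Theta2 = randn(1, 4);         % BigTheta^(2): 1 output x (3 hidden units + bias)

a1 = [1; x];                  % input layer with bias unit (a_0)^(1) = 1
a2 = [1; g(Theta1 * a1)];     % hidden layer activations with bias unit
h  = g(Theta2 * a2)           % output h_BigTheta(x) = (a_1)^(3)
```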
- Now one or more neurons in a neural network can simulate logical operations, including AND, OR, NOT, and XNOR. Here is an artificial neuron simulating the logical AND operation:
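In Octave/Matlab, using the commonly cited weights theta = [-30; 20; 20] (one choice that works, not the only one):

```matlab
g   = @(z) 1 ./ (1 + exp(-z));           % sigmoid activation
AND = @(x1, x2) g(-30 + 20*x1 + 20*x2);  % theta_0 = -30, theta_1 = theta_2 = 20

round(AND(0, 0))   % 0: g(-30) is ~0
round(AND(0, 1))   % 0: g(-10) is ~0
round(AND(1, 0))   % 0: g(-10) is ~0
round(AND(1, 1))   % 1: g(+10) is ~1
```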
-