
Neural Network classifier


The Neural Network classifier is a classifier based on a multilayer artificial neural network, which results from a set of perceptrons.

Before starting the classification process, it may be useful to introduce the Perceptron and the Neural Network concepts (if you already know them, you can jump directly to the procedure paragraph).





Perceptron classifier

The perceptron is a supervised learning model which can implement a classification function.

A single perceptron can only implement a linear classification function, which decides the class a sample belongs to by combining a set of weights with the feature vector (the vector of numbers which represents the sample).

The naive model is composed of:

  • the perceptron, which simply represents a summation
  • an input for each feature
  • a weighted connection between each input and the perceptron
  • the output
  • a bias, which corresponds to the weight of a constant input value
  • the activation function, which defines the output value corresponding to the result of the summation

Note that the weight of any connection can also be negative.

So, the training problem consists in assigning appropriate weights to the connections between the perceptron and each feature: the weights are initially assigned randomly and, for each training sample which produces a wrong classification result, they are modified following a learning rule, proportionally to a learning rate which stabilizes the process.

This process is iterated until all points are correctly classified, if the samples are linearly separable (otherwise, until the minimum possible error is reached).
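
As a concrete illustration, the following is a minimal NumPy sketch of the perceptron model and learning rule described above (the initialization, learning rate and number of epochs are arbitrary choices made for the example, and this is not the toolbox's own implementation):

```python
import numpy as np

def train_perceptron(X, y, learning_rate=0.1, epochs=100):
    """Perceptron learning rule: the weights are updated only on misclassified samples.

    X: (n_samples, n_features) feature matrix, y: (n_samples,) labels in {0, 1}.
    """
    rng = np.random.default_rng(0)
    weights = rng.normal(size=X.shape[1])  # weights are initially assigned randomly
    bias = rng.normal()                    # bias = weight of a constant input

    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            # weighted sum of the inputs plus the bias
            summation = np.dot(weights, xi) + bias
            # step activation function: class 1 if non-negative, class 0 otherwise
            prediction = 1 if summation >= 0 else 0
            # update the weights only on a wrong classification result,
            # proportionally to the learning rate
            update = learning_rate * (target - prediction)
            if update != 0:
                weights = weights + update * xi
                bias = bias + update
                errors += 1
        if errors == 0:  # all samples correctly classified (linearly separable case)
            break
    return weights, bias
```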


The classification phase consists, for each test sample, in multiplying each feature's value by the corresponding weight, summing the results together with the bias value, and assigning the class corresponding to the value returned by the activation function.
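
Using the weights and bias returned by the training sketch above, classifying a test sample reduces to the weighted sum plus the bias, followed by the activation function (again, a purely illustrative sketch on a hypothetical toy dataset):

```python
def classify(x, weights, bias):
    # multiply each feature by its weight, sum the results together with
    # the bias, and apply the step activation function
    return 1 if np.dot(weights, x) + bias >= 0 else 0

# hypothetical usage on a tiny, linearly separable (AND-like) dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
weights, bias = train_perceptron(X, y)
print([classify(x, weights, bias) for x in X])  # expected: [0, 0, 0, 1]
```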





Neural Network classifier

The Neural Network classifier is a network of multiple perceptrons.

In these networks, there are input units, output units and often also units in the middle, called hidden units.

So, a perceptron can take as inputs the values of the features of the sample, or the output value of another perceptron, while the output can be the class label or a value used as input by other perceptrons.


A feed-forward multi-layer (FF-ML) architecture is composed of units arranged into layers: the input layer (fictitious units corresponding to the inputs), the output layer (one unit for two-class problems) and one or more hidden layers.

In these networks, there are no recurrent connections and every hidden and output unit receives inputs only from units of the previous layer, and every input and hidden unit sends its output only to units in the next layer.

In these networks, the error is propagated backward through the layers (backpropagation) in order to update the weights of each perceptron.
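
The following is a minimal NumPy sketch of such a feed-forward network with one hidden layer and one output unit, trained by backpropagating the output error; the sigmoid activations, squared-error gradient and hyperparameters are assumptions made for the example, not the toolbox's implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, n_hidden=10, learning_rate=0.5, epochs=2000, seed=0):
    """Tiny feed-forward network: input layer -> one hidden layer -> one output unit
    (two-class problem), trained with backpropagation of the output error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))  # input  -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))            # hidden -> output weights
    b2 = np.zeros(1)
    y = y.reshape(-1, 1).astype(float)

    for _ in range(epochs):
        # forward pass: each layer receives inputs only from the previous layer
        hidden = sigmoid(X @ W1 + b1)
        output = sigmoid(hidden @ W2 + b2)

        # backward pass: the output error is propagated back through the layers
        delta_out = (output - y) * output * (1 - output)
        delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)

        # gradient-descent updates of the weights and biases
        W2 -= learning_rate * hidden.T @ delta_out / len(X)
        b2 -= learning_rate * delta_out.mean(axis=0)
        W1 -= learning_rate * X.T @ delta_hidden / len(X)
        b1 -= learning_rate * delta_hidden.mean(axis=0)

    return W1, b1, W2, b2

def predict_mlp(X, params):
    W1, b1, W2, b2 = params
    # forward pass followed by a 0.5 threshold on the output unit
    return (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) >= 0.5).astype(int).ravel()
```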

Furthermore, the dataset is subdivided into three subsets:

  • The training set, which is used to train the model
  • The validation set, which is used to tune some parameters of the model
  • The test set, which is used to evaluate the performance
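
A possible way to obtain the three subsets, sketched here with scikit-learn on a hypothetical dataset (the split fractions are only examples):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# hypothetical feature matrix and labels
X = np.random.default_rng(0).normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# first separate the test set, then carve a validation set out of the remainder
X_trainval, X_test, y_trainval, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_trainval, y_trainval, test_size=0.25, random_state=0)
# resulting proportions: 60% training, 20% validation, 20% test
```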





The Neural Network classification step

Athena allows you to set some parameters of the classifier, such as the number of repetitions of the training-test cycle, the number of hidden layers it is composed of and the validation fraction, or to select the default parameters, which will be set automatically by the toolbox (1000 repetitions, 10 hidden layers and a validation fraction equal to 0.2, evaluated with a training-test split with a training fraction equal to 0.5).

Furthermore, you have to select one of the evaluation methods.
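
Just to make these parameters concrete, the sketch below sets up an analogous repeated training-test evaluation with scikit-learn's MLPClassifier; the data are synthetic and the mapping of Athena's parameters onto this library (in particular the hidden layer setting) is an assumption, not the toolbox's actual code:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# hypothetical data standing in for the extracted features
X = np.random.default_rng(1).normal(size=(200, 8))
y = (X[:, 0] - X[:, 3] > 0).astype(int)

n_repetitions = 1000   # toolbox default; reduce it for a quick trial
accuracies = []
for repetition in range(n_repetitions):
    # training-test split with a training fraction equal to 0.5
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=0.5, random_state=repetition)
    clf = MLPClassifier(hidden_layer_sizes=(10,),   # hidden configuration (assumed interpretation)
                        validation_fraction=0.2,    # validation fraction equal to 0.2
                        early_stopping=True,
                        max_iter=500)
    clf.fit(X_train, y_train)
    accuracies.append(clf.score(X_test, y_test))

print("average accuracy:", np.mean(accuracies))
```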

After all the repetitions are finished, a file containing all the used parameters and the resulting performance will be created inside a Classification folder within your main data directory.

The confusion matrix, with the resulting average accuracy value, will be shown in a figure.

The ROC curve will also be shown, with the corresponding AUC value.
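
For reference, these quantities can be computed with standard library functions; the small scikit-learn sketch below uses hypothetical labels and scores (Athena computes and plots them internally, so this is only an illustration):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score, roc_curve, auc

# hypothetical true labels, predicted labels and predicted scores
y_true  = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_pred  = np.array([0, 1, 1, 1, 0, 0, 1, 0])
y_score = np.array([0.2, 0.6, 0.8, 0.9, 0.4, 0.1, 0.7, 0.3])  # e.g. class-1 probabilities

print(confusion_matrix(y_true, y_pred))        # rows: true class, columns: predicted class
print("accuracy:", accuracy_score(y_true, y_pred))

fpr, tpr, _ = roc_curve(y_true, y_score)       # points of the ROC curve
print("AUC:", auc(fpr, tpr))                   # area under the ROC curve
```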

Finally, you can repeat the classification by changing the parameters or the evaluation method, or you can return to the classifiers list.
