The two main activation functions used in current applications are both sigmoids, and are described by

$$y(v_i) = \tanh(v_i) \quad \text{and} \quad y(v_i) = \frac{1}{1 + e^{-v_i}},$$

in which the former function is a hyperbolic tangent which ranges from -1 to 1, and the latter, the logistic function, is similar in shape but ranges from 0 to 1. Here $y_i$ is the output of the $i$th node (neuron) and $v_i$ is the weighted sum of the input synapses. More specialized activation functions include radial basis functions, which are used in another class of supervised neural network models.
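As a minimal sketch of the two functions (the names `tanh_activation` and `logistic_activation` are illustrative, not standard API):

```python
import numpy as np

def tanh_activation(v):
    """Hyperbolic tangent: squashes the weighted input sum into (-1, 1)."""
    return np.tanh(v)

def logistic_activation(v):
    """Logistic sigmoid: the same S-shape, but squashed into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-v))

v = np.linspace(-5.0, 5.0, 11)
print(tanh_activation(v))      # values approach -1 and 1 at the extremes
print(logistic_activation(v))  # values approach 0 and 1 at the extremes
```

Both curves are centered at $v = 0$, where the tanh outputs 0 and the logistic function outputs 0.5.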
The multilayer perceptron consists of an input and an output layer with one or more hidden layers of nonlinearly-activating nodes. Each node in one layer connects with a certain weight to every other node in the following layer.
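The layer structure above amounts to a chain of matrix products with a nonlinearity applied at each layer. A minimal forward-pass sketch, assuming a hypothetical network with 3 inputs, 4 hidden nodes, and 2 outputs, and tanh activations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 3 inputs -> 4 hidden nodes -> 2 outputs.
# Row (j, i) of each matrix is the weight from node i to node j,
# so every node connects to every node in the following layer.
W_hidden = rng.normal(size=(4, 3))
W_output = rng.normal(size=(2, 4))

def forward(x):
    """One forward pass: each layer forms a weighted sum v, then applies tanh."""
    v_hidden = W_hidden @ x       # weighted input sums at the hidden nodes
    y_hidden = np.tanh(v_hidden)  # nonlinear activation
    v_output = W_output @ y_hidden
    return np.tanh(v_output)

print(forward(np.array([1.0, 0.5, -0.5])))
```

With tanh activations, every output necessarily lies in (-1, 1), whatever the weights are.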
We represent the error in output node $j$ in the $n$th data point by $e_j(n) = d_j(n) - y_j(n)$, where $d_j(n)$ is the target value and $y_j(n)$ is the value produced by the perceptron. We then make corrections to the weights of the nodes that minimize the energy of error in the entire output, given by

$$\mathcal{E}(n) = \frac{1}{2}\sum_j e_j^2(n).$$
Using gradient descent, we find our change in each weight to be

$$\Delta w_{ji}(n) = -\eta \frac{\partial \mathcal{E}(n)}{\partial v_j(n)}\, y_i(n),$$

where $y_i(n)$ is the output of the previous neuron and $\eta$ is the learning rate, which is carefully selected to ensure that the weights converge to a response that is neither too specific nor too general. In programming applications, $\eta$ typically ranges from 0.2 to 0.8.
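A sketch of this update for a single synapse, using a hypothetical gradient value in place of the derivative (whose form the following paragraphs work out):

```python
eta = 0.5  # learning rate, chosen from the typical 0.2-0.8 range

def update_weight(w, dE_dv, y_prev):
    """Apply the rule Δw = -η · (∂E/∂v) · y_prev to one synapse weight."""
    return w + (-eta) * dE_dv * y_prev

# Hypothetical values: weight 0.3, gradient 0.2, previous-neuron output 0.9.
w_new = update_weight(0.3, 0.2, 0.9)
print(w_new)  # 0.3 - 0.5 * 0.2 * 0.9 = 0.21 (up to floating-point rounding)
```

A positive gradient means the error grows with the weighted sum, so the weight is pushed down, scaled by how strongly the previous neuron fired.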
The derivative to be calculated depends on the weighted input sum $v_j$, which itself varies. It is easy to prove that for an output node this derivative can be simplified to

$$-\frac{\partial \mathcal{E}(n)}{\partial v_j(n)} = e_j(n)\,\phi'(v_j(n)),$$

where $\phi'$ is the derivative of the activation function described above, which itself does not vary. The analysis is more difficult for the change in weights to a hidden node, but it can be shown that the relevant derivative is

$$-\frac{\partial \mathcal{E}(n)}{\partial v_j(n)} = \phi'(v_j(n)) \sum_k -\frac{\partial \mathcal{E}(n)}{\partial v_k(n)}\, w_{kj}(n).$$
Note that this depends on the change in weights of the $k$th nodes, which represent the output layer. So to change the hidden layer weights, we must first change the output layer weights according to the derivative of the activation function, and so this algorithm represents a backpropagation of the activation function.
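The whole procedure can be sketched on a tiny network. This is a minimal illustration, not a production implementation: it assumes a hypothetical 2-3-1 architecture with tanh activations (for which $\phi'(v) = 1 - \tanh^2(v)$) and trains on an XOR-like task with targets of $\pm 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
eta = 0.5  # learning rate from the typical range

# Hypothetical 2-3-1 network: input -> hidden and hidden -> output weights.
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(1, 3))

def train_step(x, d):
    """One backpropagation step on a single (input, target) pair."""
    global W1, W2
    # Forward pass: weighted sums v and activations y at each layer.
    v1 = W1 @ x
    y1 = np.tanh(v1)
    v2 = W2 @ y1
    y2 = np.tanh(v2)
    # Output layer: -dE/dv = e * phi'(v), with e = d - y.
    e = d - y2
    delta2 = e * (1.0 - y2**2)
    # Hidden layer: -dE/dv = phi'(v) * sum_k delta_k * w_kj,
    # i.e. the output-layer gradients are propagated backwards.
    delta1 = (1.0 - y1**2) * (W2.T @ delta2)
    # Weight corrections Δw_ji = eta * delta_j * y_i.
    W2 += eta * np.outer(delta2, y1)
    W1 += eta * np.outer(delta1, x)
    return 0.5 * np.sum(e**2)  # error energy E(n)

data = [([0., 0.], [-1.]), ([0., 1.], [1.]),
        ([1., 0.], [1.]), ([1., 1.], [-1.])]
errors = []
for epoch in range(2000):
    E = sum(train_step(np.array(x), np.array(d)) for x, d in data)
    errors.append(E)
print(f"error energy: {errors[0]:.4f} -> {errors[-1]:.4f}")
```

Over repeated passes the total error energy falls, which is the observable effect of the two derivative formulas above being applied layer by layer, output first.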
Currently, multilayer perceptrons are most commonly seen in speech recognition, image recognition, and machine translation software, but they have also found applications in other fields such as cybersecurity. In general, their most important use has been in the growing field of artificial intelligence, where the multilayer perceptron's power comes from its similarity to certain biological neural networks in the human brain.