Namespace: ACTIVATION

Activation functions and their derivatives for a Neuron.

Members

(static) identity

Simply passes the input to the output with no transformation.
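A minimal sketch of what this activation and its derivative look like (the names `identity`/`identityDerivative` are illustrative, not anny's confirmed API; the function/derivative pairing follows the namespace description):

```javascript
// Sketch of an identity activation: the input passes through unchanged.
const identity = x => x

// Its derivative is the constant 1 everywhere.
const identityDerivative = x => 1
```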

(static) logistic

A smoothed step function or an 'S' shape. Also called the sigmoid function, though there are many sigmoid functions.
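The logistic curve and its derivative can be sketched as follows (a hedged illustration; the names are not anny's confirmed API):

```javascript
// Logistic (sigmoid): squashes any real input into the open interval (0, 1).
const logistic = x => 1 / (1 + Math.exp(-x))

// The derivative has a convenient closed form in terms of the output:
// f'(x) = f(x) * (1 - f(x)).
const logisticDerivative = x => {
  const fx = logistic(x)
  return fx * (1 - fx)
}
```

For example, logistic(0) is 0.5, the midpoint of the 'S' shape.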

(static) optimalTanh

Modified hyperbolic tangent function. Optimized for faster convergence.
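The docs do not state the constants, so as an assumption this sketch uses the scaled tanh recommended by LeCun for faster convergence, 1.7159 · tanh(2x/3); anny's actual constants may differ:

```javascript
// Assumed definition: LeCun's scaled tanh, chosen so that f(1) ≈ 1 and
// f(-1) ≈ -1, which tends to speed up gradient-descent convergence.
const optimalTanh = x => 1.7159 * Math.tanh((2 / 3) * x)

// Derivative by the chain rule: 1.7159 * (2/3) * (1 - tanh(2x/3)^2).
const optimalTanhDerivative = x => {
  const t = Math.tanh((2 / 3) * x)
  return 1.7159 * (2 / 3) * (1 - t * t)
}
```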

(static) rectifier

Simply max(0, x). Its derivative is a step function: 0 for negative inputs and 1 for positive ones (undefined at exactly 0). Interestingly, the derivative of its smooth approximation, the softplus, turns out to be the logistic function.
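A sketch of the rectifier and its step-function derivative (illustrative names, not anny's confirmed API; returning 0 for the derivative at exactly 0 is one common convention):

```javascript
// Rectifier (ReLU): zero for negative inputs, identity for positive ones.
const rectifier = x => Math.max(0, x)

// The derivative is a step function. It is undefined at exactly 0;
// returning 0 there is an arbitrary but common choice.
const rectifierDerivative = x => (x > 0 ? 1 : 0)
```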

(static) softplus

A smooth approximation of the rectifier.
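Softplus is ln(1 + e^x), and its derivative works out to be exactly the logistic function. A sketch (names illustrative, not anny's confirmed API):

```javascript
// Softplus: ln(1 + e^x), a smooth, everywhere-differentiable
// approximation of max(0, x).
const softplus = x => Math.log(1 + Math.exp(x))

// d/dx ln(1 + e^x) = e^x / (1 + e^x) = 1 / (1 + e^-x): the logistic function.
const softplusDerivative = x => 1 / (1 + Math.exp(-x))
```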

(static) tanh

The hyperbolic tangent function. A sigmoid curve, like the logistic function, except it has a range of (-1, +1). Often performs better than the logistic function because of its symmetry. Ideal for multilayer perceptrons, particularly the hidden layers.
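A sketch built on JavaScript's built-in Math.tanh (names illustrative, not anny's confirmed API):

```javascript
// Hyperbolic tangent: a sigmoid curve with range (-1, +1),
// symmetric about the origin.
const tanh = x => Math.tanh(x)

// Derivative: 1 - tanh(x)^2.
const tanhDerivative = x => 1 - Math.tanh(x) ** 2
```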
