Activation function

Last revised by Andrew Murphy on 2 Aug 2021

In neural networks, an activation function transforms the weighted sum of a neuron's inputs plus a bias term into the neuron's output. Using a biological analogy, the activation function determines the “firing rate” of a neuron in response to an input or stimulus. These functions introduce non-linearities into the network, enabling it to perform complex tasks such as image recognition and language processing. Without non-linear activation functions, an artificial neural network, however many layers it has, behaves as a simple linear regression model.

Such functions include:

  • sigmoid function
  • rectified linear unit (ReLU) function
  • hyperbolic tangent (Tanh) function
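
The three functions listed above can be sketched in a few lines of code; this is a minimal illustration using NumPy, not a reference implementation:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive inputs through unchanged; outputs 0 otherwise
    return np.maximum(0.0, z)

def tanh(z):
    # Squashes any real input into the range (-1, 1)
    return np.tanh(z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values strictly between 0 and 1
print(relu(z))     # negatives clipped to 0
print(tanh(z))     # values strictly between -1 and 1
```

Applied element-wise to a layer's pre-activations, each of these maps keeps the network's composition of layers non-linear, which is what the paragraph above describes.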
