Activation

Icon:

Function:

This layer applies only an activation function. Activation functions are mathematical functions that determine whether a neuron's output should be activated ("fired") or not. They also help map network outputs into a bounded range, such as between 0 and 1 or between -1 and 1.




Parameters:
Activation:
  • Linear
  • ReLU
  • Sigmoid
  • Tanh
  • Softmax
  • SoftPlus
  • Softsign
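As an illustration of the options above, the listed activations could be sketched in NumPy as follows. This is a minimal, assumed implementation for reference only (the layer itself simply applies the chosen function element-wise; Softmax is the exception, normalizing across a whole vector so its outputs sum to 1):

```python
import numpy as np

def linear(x):
    # Identity: passes values through unchanged.
    return x

def relu(x):
    # Zeroes out negative values, keeps positive ones.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes values into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes values into the range (-1, 1).
    return np.tanh(x)

def softmax(x):
    # Normalizes a vector into a probability distribution (sums to 1).
    # Subtracting the max improves numerical stability.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x).
    return np.log1p(np.exp(x))

def softsign(x):
    # Like tanh but with polynomial rather than exponential tails.
    return x / (1.0 + np.abs(x))
```

For example, `sigmoid` maps 0 to 0.5, and `softmax([1, 2, 3])` returns three values that sum to 1, with the largest weight on the last element.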