BNNS.ActivationFunction.selu
An activation function that evaluates the scaled exponential linear units (SELU) on its input.
Declaration
case selu

Discussion
This constant defines an activation function that returns values using the following operation:
// λ and ɑ have the values given by Klambauer, Unterthiner, and Mayr
// (approximately 1.0507 and 1.6733, respectively).
if x < 0
    λɑ(exp(x) - 1)
else
    λx

The following illustrates the output that the activation function generates from inputs in the range -10...10:
[Plot of the SELU activation function's output for inputs in the range -10...10]
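The operation above can be sketched in plain Swift. This is an illustrative reference implementation using the commonly cited constant values, not the BNNS implementation itself:

```swift
import Foundation

// Constants from Klambauer, Unterthiner, and Mayr (2017);
// these are the commonly cited approximate values.
let lambda = 1.0507009873554805
let alpha = 1.6732632423543772

/// Evaluates the SELU operation on a single value:
/// λɑ(exp(x) - 1) for negative inputs, λx otherwise.
func selu(_ x: Double) -> Double {
    x < 0 ? lambda * alpha * (exp(x) - 1) : lambda * x
}
```

For negative inputs the output approaches the asymptote -λɑ (about -1.758), while nonnegative inputs are scaled linearly by λ.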
See Also
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)