BNNS.ActivationFunction.silu
An activation function that returns the sigmoid linear unit (SiLU) function of its input.
Declaration
case silu
Discussion
This constant defines an activation function that returns values using the following operation:
SiLU(x) = x * sigmoid(x)

The following illustrates the output that the activation function generates from inputs in the range -10...10:
[Image: graph of SiLU(x) for inputs in the range -10...10]
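The following is a minimal plain-Swift sketch of the operation above; the siluValue(_:) helper is illustrative and isn't part of the BNNS API:

import Foundation

/// Returns x * sigmoid(x), where sigmoid(x) = 1 / (1 + exp(-x)).
func siluValue(_ x: Float) -> Float {
    return x * (1 / (1 + exp(-x)))
}

// Sample the function over the illustrated range -10...10.
for x in stride(from: Float(-10), through: 10, by: 2.5) {
    print("SiLU(\(x)) = \(siluValue(x))")
}

To apply the activation with BNNS itself, you pass this case to an activation layer. The following is a sketch that assumes the BNNS.ActivationLayer and BNNS.NDArrayDescriptor APIs in Accelerate; check those symbols' documentation for the exact signatures:

import Accelerate

var input: [Float] = [-2, -1, 0, 1, 2]
var output = [Float](repeating: 0, count: input.count)
let count = input.count

input.withUnsafeMutableBufferPointer { inputPointer in
    output.withUnsafeMutableBufferPointer { outputPointer in
        guard
            let inputDescriptor = BNNS.NDArrayDescriptor(data: inputPointer,
                                                         shape: .vector(count)),
            let outputDescriptor = BNNS.NDArrayDescriptor(data: outputPointer,
                                                          shape: .vector(count)),
            // The silu case defined on this page selects the activation function.
            let layer = BNNS.ActivationLayer(function: .silu,
                                             input: inputDescriptor,
                                             output: outputDescriptor)
        else { return }
        try? layer.apply(batchSize: 1, input: inputDescriptor, output: outputDescriptor)
    }
}
// output now holds x * sigmoid(x) for each input element.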
See Also
Related Documentation
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)