BNNS.ActivationFunction.softplus(alpha:beta:)
An activation function that returns the softplus function of its input.
Declaration
case softplus(alpha: Float, beta: Float)

Discussion
This constant defines an activation function that returns values using the following operation:
alpha * log( 1 + exp(beta * x) )

The following illustrates the output that the activation function generates from inputs in the range -10...10, with an alpha of 1.0 and a beta of 0.5:
[Figure: softplus output for inputs in the range -10...10, with alpha = 1.0 and beta = 0.5]
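The following sketch shows how you might apply this case to a vector of values using BNNS.applyActivation(_:input:output:batchSize:) from the Accelerate framework. The sample input values and the stride used to generate them are illustrative assumptions, not part of this page:

    import Accelerate

    // Sample inputs spanning the illustrated range -10...10.
    let inputValues: [Float] = stride(from: Float(-10), through: 10, by: 2.5).map { $0 }

    // Wrap the input in an N-dimensional array descriptor, and allocate
    // an uninitialized descriptor of the same shape for the result.
    let input = BNNSNDArrayDescriptor.allocate(initializingFrom: inputValues,
                                               shape: .vector(inputValues.count))
    let output = BNNSNDArrayDescriptor.allocateUninitialized(scalarType: Float.self,
                                                             shape: .vector(inputValues.count))
    defer {
        input.deallocate()
        output.deallocate()
    }

    do {
        // Computes output[i] = 1.0 * log(1 + exp(0.5 * inputValues[i])).
        try BNNS.applyActivation(.softplus(alpha: 1.0, beta: 0.5),
                                 input: input,
                                 output: output,
                                 batchSize: 1)

        let result = output.makeArray(of: Float.self)
        print(result ?? [])
    } catch {
        print("Activation failed: \(error)")
    }

Because softplus is a smooth approximation of the rectified linear function, the printed values approach zero for strongly negative inputs and approach alpha * beta * x for strongly positive ones.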
See Also
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)