BNNS.ActivationFunction.softShrink(alpha:)
An activation function that returns zero when the absolute value of its input is less than alpha; otherwise, it returns its input shifted toward zero by alpha.
Declaration
case softShrink(alpha: Float)
Discussion
This constant defines an activation function that returns values using the following operation:
0 if abs(x) < abs(alpha)
x - copysign(alpha, x) otherwise

The following illustrates the output that the activation function generates from inputs in the range -10...10 and an alpha of 5:
[Image: plot of the soft-shrink activation function for inputs in the range -10...10 with an alpha of 5]
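The operation above can be sketched as a scalar Swift function. Note that `softShrink(_:alpha:)` below is an illustrative helper written for this example, not the BNNS API itself:

```swift
import Foundation

// A scalar sketch of the soft-shrink operation:
// returns 0 when abs(x) < abs(alpha), otherwise
// shifts x toward zero by alpha.
func softShrink(_ x: Float, alpha: Float) -> Float {
    abs(x) < abs(alpha) ? 0 : x - copysign(alpha, x)
}

// For example, with alpha = 5:
//   softShrink(3, alpha: 5) yields 0, because abs(3) < 5.
//   softShrink(7, alpha: 5) yields 2, because 7 - copysign(5, 7) = 2.
//   softShrink(-7, alpha: 5) yields -2, because -7 - copysign(5, -7) = -2.
```

Using `copysign(alpha, x)` rather than subtracting `alpha` directly keeps the shift symmetric: positive inputs move down by alpha and negative inputs move up by alpha.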
See Also
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)