BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
An activation function that returns its input clamped to a maximum of beta when the input is greater than or equal to zero; otherwise, it returns its input multiplied by alpha, clamped to a maximum of beta.
Declaration
case clampedLeakyRectifiedLinear(alpha: Float, beta: Float)
Discussion
This constant defines an activation function that returns values using the following operation:
if x < 0
    min(alpha * x, beta)
else
    min(x, beta)
The following illustrates the output that the activation function generates from inputs in the range -10...10, with an alpha of 1 and a beta of 5:
[Image: graph of the activation function's output for inputs in the range -10...10, with alpha = 1 and beta = 5.]
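As a sanity check, the operation above can be sketched in plain Swift. The `clampedLeakyReLU` helper below is hypothetical and only mirrors the documented math; it does not go through a BNNS layer:

```swift
import Foundation

// Hypothetical helper that mirrors the documented operation:
// if x < 0, return min(alpha * x, beta); otherwise return min(x, beta).
func clampedLeakyReLU(_ x: Float, alpha: Float, beta: Float) -> Float {
    x < 0 ? min(alpha * x, beta) : min(x, beta)
}

// Sample inputs matching the illustrated range, with alpha = 1 and beta = 5.
let inputs: [Float] = [-10, -2, 0, 2, 10]
let outputs = inputs.map { clampedLeakyReLU($0, alpha: 1, beta: 5) }
// With alpha = 1, negative and small positive inputs pass through unchanged,
// while large positive inputs are clamped to beta.
```

With alpha of 1, the function behaves like the identity below beta; a smaller alpha (for example, 0.1) attenuates negative inputs, giving the characteristic leaky shape.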
See Also
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)
BNNS.ActivationFunction.linearWithBias(alpha:beta:)