BNNS.ActivationFunction.geluApproximation2(alpha:beta:)
An activation function that provides a fast evaluation of the Gaussian error linear units (GELU) approximation on its input.
Declaration
case geluApproximation2(alpha: Float, beta: Float)
Discussion
This constant defines an activation function that returns values using the following operation:
x * (ReLU6(x + 3.0) * 1.0 / 6.0)
The following illustrates the output that the activation function generates from inputs in the range -10...10, an alpha of 0.1, and a beta of 1.0. The thinner, dashed line shows, for comparison, the result of BNNSActivationFunctionGELUApproximation using the same alpha and beta values:
[Image: plot comparing geluApproximation2 with BNNSActivationFunctionGELUApproximation over the range -10...10]
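As a minimal sketch, the following applies this activation to a small vector using BNNS.applyActivation(activation:input:output:batchSize:) and BNNSNDArrayDescriptor. The input values are illustrative; the alpha of 0.1 and beta of 1.0 mirror the illustration above:

import Accelerate

// Illustrative input; any contiguous Float buffer works.
var input: [Float] = [-2, -1, 0, 1, 2]
var output = [Float](repeating: 0, count: input.count)

input.withUnsafeMutableBufferPointer { inputPtr in
    output.withUnsafeMutableBufferPointer { outputPtr in
        // Wrap the buffers in N-dimensional array descriptors.
        guard
            let inputDescriptor = BNNSNDArrayDescriptor(
                data: inputPtr,
                shape: .vector(inputPtr.count)),
            let outputDescriptor = BNNSNDArrayDescriptor(
                data: outputPtr,
                shape: .vector(outputPtr.count))
        else { return }

        do {
            // Apply the fast GELU approximation to each element.
            try BNNS.applyActivation(
                activation: .geluApproximation2(alpha: 0.1, beta: 1.0),
                input: inputDescriptor,
                output: outputDescriptor,
                batchSize: 1)
        } catch {
            print("Activation failed: \(error)")
        }
    }
}

print(output)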
See Also
Activation Functions
BNNS.ActivationFunction.abs
BNNS.ActivationFunction.celu(alpha:)
BNNS.ActivationFunction.clamp(bounds:)
BNNS.ActivationFunction.clampedLeakyRectifiedLinear(alpha:beta:)
BNNS.ActivationFunction.elu(alpha:)
BNNS.ActivationFunction.geluApproximation(alpha:beta:)
BNNS.ActivationFunction.gumbel(alpha:beta:)
BNNS.ActivationFunction.gumbelMax(alpha:beta:)
BNNS.ActivationFunction.hardShrink(alpha:)
BNNS.ActivationFunction.hardSigmoid(alpha:beta:)
BNNS.ActivationFunction.hardSwish(alpha:beta:)
BNNS.ActivationFunction.identity
BNNS.ActivationFunction.leakyRectifiedLinear(alpha:)
BNNS.ActivationFunction.linear(alpha:)
BNNS.ActivationFunction.linearWithBias(alpha:beta:)