BNNS.ActivationFunction.geluApproximation(alpha:beta:)

An activation function that evaluates the Gaussian error linear units (GELU) approximation on its input.

Declaration

case geluApproximation(alpha: Float, beta: Float)

Discussion

This constant defines an activation function that returns values using the following operation:

0.5 * x * (1.0 + tanh(alpha * (x + beta * x * x * x)))
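
To make the operation concrete, the sketch below evaluates the same expression directly in plain Swift. The `geluApproximation(_:alpha:beta:)` helper is hypothetical and not part of the BNNS API, and the alpha and beta values shown are the constants conventionally used for the tanh-based GELU approximation (the square root of 2/π, and 0.044715); this documentation itself doesn't prescribe specific parameter values:

```swift
import Foundation

/// Evaluates 0.5 * x * (1.0 + tanh(alpha * (x + beta * x * x * x)))
/// for a single value. This helper is hypothetical; it only mirrors
/// the operation the activation case performs and isn't part of BNNS.
func geluApproximation(_ x: Float, alpha: Float, beta: Float) -> Float {
    0.5 * x * (1.0 + tanh(alpha * (x + beta * x * x * x)))
}

// The tanh-based GELU approximation is conventionally parameterized
// with alpha = sqrt(2 / pi) and beta = 0.044715 (an assumption here,
// not a value taken from this documentation).
let alpha: Float = (2.0 / Float.pi).squareRoot()
let beta: Float = 0.044715

// Sample the function over a few inputs.
for x: Float in stride(from: -2.0, through: 2.0, by: 1.0) {
    print(x, geluApproximation(x, alpha: alpha, beta: beta))
}
```

The corresponding BNNS activation for those parameters would then be `BNNS.ActivationFunction.geluApproximation(alpha: alpha, beta: beta)`.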

The following figure illustrates the output that the activation function generates for inputs in the range -10...10, with an alpha of 0.1 and a beta of 1.0:

[Image]

See Also

Activation Functions