BNNSActivationFunctionGELUApproximation
An activation function that evaluates the Gaussian error linear units (GELU) approximation on its input.
Declaration
var BNNSActivationFunctionGELUApproximation: BNNSActivationFunction { get }
Discussion
This constant defines an activation function that returns values using the following operation:
0.5f * x * (1.0f + tanh(alpha * (x + beta * x * x * x)))
The following illustrates the output that the activation function generates from inputs in the range -10...10, with an alpha of 0.1 and a beta of 1.0:
[Plot of the GELU approximation activation output for inputs in the range -10...10, with alpha 0.1 and beta 1.0.]
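To see the activation's behavior, you can evaluate the same operation directly. The following is a minimal, self-contained Swift sketch that reimplements the formula above with the alpha of 0.1 and beta of 1.0 used in the plot; the helper name is illustrative and is not part of the BNNS API. In practice, you pass this constant, together with alpha and beta, in a BNNSActivation structure when you configure a layer.
import Foundation

// Standalone reimplementation of the operation above, for illustration only.
// The function name and sample values are illustrative, not part of BNNS.
func geluApproximation(_ x: Float, alpha: Float = 0.1, beta: Float = 1.0) -> Float {
    0.5 * x * (1.0 + tanh(alpha * (x + beta * x * x * x)))
}

// Sample the function over the range -10...10.
for x in stride(from: Float(-10), through: 10, by: 2) {
    print("x = \(x), y = \(geluApproximation(x))")
}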
See Also
Raw Values
init(_:)
init(rawValue:)
rawValue
BNNSActivationFunctionAbs
BNNSActivationFunctionCELU
BNNSActivationFunctionClampedLeakyRectifiedLinear
BNNSActivationFunctionELU
BNNSActivationFunctionErf
BNNSActivationFunctionGELU
BNNSActivationFunctionGELUApproximation2
BNNSActivationFunctionGELUApproximationSigmoid
BNNSActivationFunctionGumbel
BNNSActivationFunctionGumbelMax
BNNSActivationFunctionHardShrink
BNNSActivationFunctionHardSigmoid