BNNSActivationFunctionELU
An activation function that evaluates the exponential linear unit (ELU) on its input.
Declaration
var BNNSActivationFunctionELU: BNNSActivationFunction { get }
Discussion
This constant defines an activation function that returns values using the following operation:
if x < 0
    alpha * (exp(x) - 1)
else
    x

The following illustrates the output that the activation function generates from inputs in the range -10...10 and an alpha of 1:
[Image: plot of the ELU output for inputs in -10...10 with an alpha of 1]
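As a sketch of the operation above (plain Swift, not the BNNS API itself; the `elu` helper name is illustrative, and `alpha` defaults to 1 as in the plot):

```swift
import Foundation

/// A reference implementation of the ELU operation this constant selects.
/// BNNS evaluates it natively; this only mirrors the math for clarity.
func elu(_ x: Double, alpha: Double = 1) -> Double {
    x < 0 ? alpha * (exp(x) - 1) : x
}

// Positive inputs pass through unchanged;
// negative inputs saturate smoothly toward -alpha.
print(elu(3.0))   // 3.0
print(elu(-1.0))  // exp(-1) - 1, approximately -0.632
```

Because the negative branch approaches -alpha rather than diverging, ELU keeps gradients nonzero for negative inputs while bounding how negative the output can get.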
See Also
Raw Values
init(_:)
init(rawValue:)
rawValue
BNNSActivationFunctionAbs
BNNSActivationFunctionCELU
BNNSActivationFunctionClampedLeakyRectifiedLinear
BNNSActivationFunctionErf
BNNSActivationFunctionGELU
BNNSActivationFunctionGELUApproximation
BNNSActivationFunctionGELUApproximation2
BNNSActivationFunctionGELUApproximationSigmoid
BNNSActivationFunctionGumbel
BNNSActivationFunctionGumbelMax
BNNSActivationFunctionHardShrink
BNNSActivationFunctionHardSigmoid