BNNSActivationFunctionSELU
An activation function that evaluates the scaled exponential linear units (SELU) on its input.
Declaration
var BNNSActivationFunctionSELU: BNNSActivationFunction { get }
Discussion
This constant defines an activation function that returns values using the following operation:
// λ and ɑ have the values given by Klambauer, Unterthiner, and Mayr
// (~1.0507 and ~1.6733, respectively)
if x < 0
    λɑ(exp(x) - 1)
else
    λx
The following illustrates the output that the activation function generates from inputs in the range -10...10:
[Plot of the SELU activation function's output over the input range -10...10]
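As a rough sketch of the operation described above (not a call into the BNNS API itself), the following Swift function evaluates SELU directly using the approximate λ and ɑ values quoted in the discussion. The names selu, seluLambda, and seluAlpha are illustrative; BNNS supplies these constants internally when you use BNNSActivationFunctionSELU.

import Foundation

// Approximate constants from Klambauer, Unterthiner, and Mayr.
// These names are hypothetical; they are not part of the BNNS API.
let seluLambda: Float = 1.0507
let seluAlpha: Float = 1.6733

// Evaluates the SELU operation shown in the discussion above.
func selu(_ x: Float) -> Float {
    if x < 0 {
        return seluLambda * seluAlpha * (exp(x) - 1)
    } else {
        return seluLambda * x
    }
}

// For example, sample the function over the range -10...10.
let outputs = stride(from: Float(-10), through: 10, by: 1).map(selu)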
See Also
Raw Values
init(_:)
init(rawValue:)
rawValue
BNNSActivationFunctionAbs
BNNSActivationFunctionCELU
BNNSActivationFunctionClampedLeakyRectifiedLinear
BNNSActivationFunctionELU
BNNSActivationFunctionErf
BNNSActivationFunctionGELU
BNNSActivationFunctionGELUApproximation
BNNSActivationFunctionGELUApproximation2
BNNSActivationFunctionGELUApproximationSigmoid
BNNSActivationFunctionGumbel
BNNSActivationFunctionGumbelMax
BNNSActivationFunctionHardShrink