BNNSActivationFunctionHardSigmoid
An activation function that returns the hard sigmoid function of its input.
Declaration
var BNNSActivationFunctionHardSigmoid: BNNSActivationFunction { get }
Discussion
This constant defines an activation function that returns values using the following operation:
max(0, min(1, alpha * x + beta))
The following illustrates the output that the activation function generates from inputs in the range -10...10, with an alpha of 1.0 and a beta of 0.5:
[Image]
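As a sketch of the operation above, the hard sigmoid can be reproduced in plain Swift. The function name `hardSigmoid` and the default parameter values are illustrative assumptions (they mirror the alpha of 1.0 and beta of 0.5 used in the discussion), not part of the BNNS API:

```swift
import Foundation

/// Computes max(0, min(1, alpha * x + beta)), the operation this
/// activation constant performs. Defaults mirror the values used
/// in the discussion above (alpha = 1.0, beta = 0.5).
func hardSigmoid(_ x: Float, alpha: Float = 1.0, beta: Float = 0.5) -> Float {
    return max(0, min(1, alpha * x + beta))
}

// Inputs at or below -0.5 clamp to 0, inputs at or above 0.5 clamp
// to 1, and values in between map linearly.
let inputs: [Float] = [-10, -0.25, 0, 0.25, 10]
let outputs = inputs.map { hardSigmoid($0) }
```

This piecewise-linear shape is why the hard sigmoid is often used as a cheaper approximation of the standard sigmoid: it avoids the exponential while producing output in the same 0...1 range.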
See Also
Raw Values
init(_:)
init(rawValue:)
rawValue
BNNSActivationFunctionAbs
BNNSActivationFunctionCELU
BNNSActivationFunctionClampedLeakyRectifiedLinear
BNNSActivationFunctionELU
BNNSActivationFunctionErf
BNNSActivationFunctionGELU
BNNSActivationFunctionGELUApproximation
BNNSActivationFunctionGELUApproximation2
BNNSActivationFunctionGELUApproximationSigmoid
BNNSActivationFunctionGumbel
BNNSActivationFunctionGumbelMax
BNNSActivationFunctionHardShrink