BNNSActivationFunctionHardSwish
An activation function that returns the hard swish function of its input.
Declaration
var BNNSActivationFunctionHardSwish: BNNSActivationFunction { get }
Discussion
This constant defines an activation function that returns values using the following operation:
HardSwish(x) = x * ReLU6(x + 3.0) / 6.0
The following illustrates the output that the activation function generates from inputs in the range -10...10:
[Image: graph of the hard swish function for inputs in the range -10...10]
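The operation above can be sketched in plain Swift. This is a minimal illustration of the math only, not the BNNS implementation; the function names `relu6` and `hardSwish` are chosen here for clarity:

```swift
import Foundation

// ReLU6 clamps its input to the range 0...6.
func relu6(_ x: Float) -> Float {
    min(max(x, 0), 6)
}

// HardSwish(x) = x * ReLU6(x + 3.0) / 6.0
func hardSwish(_ x: Float) -> Float {
    x * relu6(x + 3.0) / 6.0
}

// For large negative inputs the output is 0; for large positive
// inputs the function approaches the identity.
print(hardSwish(-10))  // 0.0
print(hardSwish(0))    // 0.0
print(hardSwish(10))   // 10.0
```

In practice you would select this operation in BNNS via the `BNNSActivationFunctionHardSwish` constant rather than computing it yourself; the sketch only shows what values the layer produces.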
See Also
Raw Values
init(_:)
init(rawValue:)
rawValue
BNNSActivationFunctionAbs
BNNSActivationFunctionCELU
BNNSActivationFunctionClampedLeakyRectifiedLinear
BNNSActivationFunctionELU
BNNSActivationFunctionErf
BNNSActivationFunctionGELU
BNNSActivationFunctionGELUApproximation
BNNSActivationFunctionGELUApproximation2
BNNSActivationFunctionGELUApproximationSigmoid
BNNSActivationFunctionGumbel
BNNSActivationFunctionGumbelMax
BNNSActivationFunctionHardShrink