BNNS.ActivationFunction.hardSwish(alpha:beta:)

An activation function that returns the hard swish function of its input.

Declaration

case hardSwish(alpha: Float, beta: Float)

Discussion

This constant defines an activation function that returns values using the following operation:

HardSwish(x) = x * (ReLU6(x + 3.0) * 1.0/6.0)
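The operation above can be sketched in plain Swift. This is an illustrative implementation of the formula as written, not the BNNS internals; note that the formula itself does not reference the `alpha` and `beta` parameters from the declaration, and the helper names `relu6` and `hardSwish` are introduced here for clarity.

```swift
import Foundation

/// ReLU6 clamps its input to the range 0...6.
func relu6(_ x: Float) -> Float {
    min(max(x, 0), 6)
}

/// Hard swish, per the formula above: x * ReLU6(x + 3) / 6.
func hardSwish(_ x: Float) -> Float {
    x * relu6(x + 3) * (1.0 / 6.0)
}
```

For inputs at or below -3 the function returns 0, and for large positive inputs it approaches the identity, which is what makes hard swish a cheap piecewise approximation of the swish function.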

The following illustrates the output that the activation function generates from inputs in the range -10...10:

[Image: plot of the hard swish activation function over the input range -10...10]

See Also

Related Documentation

Activation Functions