BNNSActivationFunctionSiLU

An activation function that returns the sigmoid linear unit (SiLU) function of its input.

Declaration

var BNNSActivationFunctionSiLU: BNNSActivationFunction { get }

Discussion

This constant defines an activation function that returns values using the following operation:

SiLU(x) = x * sigmoid(x)

where sigmoid(x) = 1 / (1 + e^-x).

The following illustrates the output that the activation function generates from inputs in the range -10...10:

[Image: graph of the SiLU activation function's output over the input range -10...10]

See Also

Raw Values