leakyReLU(alpha:)
Adds a Leaky Rectified Linear Unit (ReLU) activation operation to the current graph.
Declaration
func leakyReLU(alpha: Float = 0.01) -> BNNSGraph.Builder.Tensor<T>

Parameters

- alpha:
The alpha value, that is, the scaling factor the operation applies to negative input values.
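
A leaky ReLU passes positive inputs through unchanged and multiplies negative inputs by alpha. The following standalone sketch illustrates that element-wise behavior; it is a hypothetical illustration of the math, not the BNNSGraph implementation.

```swift
// Element-wise leaky ReLU: identity for x > 0, alpha * x otherwise.
// (Illustrative sketch only; BNNSGraph performs this inside the graph.)
func leakyReLUValues(_ x: [Float], alpha: Float = 0.01) -> [Float] {
    x.map { $0 > 0 ? $0 : alpha * $0 }
}

// For example, with alpha = 0.5, the input [-2, 0, 3] maps to [-1, 0, 3].
```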