leakyReLU(alpha:)

Adds a Leaky Rectified Linear Unit (ReLU) activation operation to the current graph.

Declaration

func leakyReLU(alpha: Float = 0.01) -> BNNSGraph.Builder.Tensor<T>

Parameters

  • alpha:

    The slope the operation applies to negative input values. The default value is 0.01.
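Leaky ReLU passes nonnegative values through unchanged and scales negative values by alpha, that is, f(x) = x for x ≥ 0 and f(x) = alpha · x for x < 0. The following sketch illustrates the math in plain Swift, independent of the BNNSGraph API; the function name and values are illustrative only:

```swift
// Illustrative leaky ReLU: positive inputs pass through unchanged,
// negative inputs are scaled by a small slope, alpha.
func leakyReLU(_ x: Float, alpha: Float = 0.01) -> Float {
    x >= 0 ? x : alpha * x
}

let inputs: [Float] = [-2.0, -0.5, 0.0, 1.5]
let outputs = inputs.map { leakyReLU($0) }
// outputs ≈ [-0.02, -0.005, 0.0, 1.5]
```

Unlike a standard ReLU, the small negative slope keeps a nonzero gradient for negative inputs, which can help avoid inactive units during training.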