clampedReLU(alpha:beta:)

Adds a clamped rectified linear unit (ReLU) activation operation to the current graph.

Declaration

func clampedReLU(alpha: Float, beta: Float) -> BNNSGraph.Builder.Tensor<T>

Parameters

  • alpha:

    The alpha value.

  • beta:

    The beta value.
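
The documentation does not spell out how `alpha` and `beta` enter the activation function. The sketch below is a plain-Swift reference of one common convention for a two-parameter clamped ReLU, in which `alpha` acts as the slope for negative inputs and `beta` caps the output; this interpretation is an assumption, not confirmed by the source, so verify it against the operation's behavior before relying on it.

```swift
import Foundation

// Reference sketch of a clamped ReLU (assumed semantics, not the
// BNNSGraph implementation): alpha scales negative inputs and beta
// caps the output, i.e. f(x) = min(max(alpha * x, x), beta).
func clampedReLUReference(_ x: Float, alpha: Float, beta: Float) -> Float {
    min(max(alpha * x, x), beta)
}
```

Under this assumption, setting `alpha` to `0` reduces the function to the familiar capped ReLU (for example, ReLU6 when `beta` is `6`), while a small positive `alpha` gives a leaky variant with a bounded output.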