thresholdedReLU(alpha:)

Adds a thresholded rectified linear unit (ReLU) activation operation to the current graph.

Declaration

func thresholdedReLU(alpha: Float) -> BNNSGraph.Builder.Tensor<T>

Parameters

  • alpha:

    The threshold value. Elements greater than alpha pass through unchanged; all other elements become zero.
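
Discussion

As a sketch of the element-wise math this operation applies (the function below is a hypothetical standalone illustration, not part of the BNNSGraph API):

```swift
// Hypothetical illustration of the thresholded ReLU math:
// f(x) = x when x > alpha, otherwise 0.
func thresholdedReLU(_ x: [Float], alpha: Float) -> [Float] {
    x.map { $0 > alpha ? $0 : 0 }
}

let out = thresholdedReLU([-1.0, 0.5, 2.0], alpha: 1.0)
print(out)  // [0.0, 0.0, 2.0]
```

With alpha set to 0, this reduces to the standard ReLU.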