thresholdedReLU(alpha:)
Adds a Thresholded Leaky Rectified Linear Unit (ReLU) activation operation to the current graph.
Declaration
func thresholdedReLU(alpha: Float) -> BNNSGraph.Builder.Tensor<T>
Parameters
- alpha:
The alpha value.
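For intuition, a common convention for a thresholded ReLU is to pass values through unchanged when they exceed the threshold `alpha` and zero them otherwise. The plain-Swift sketch below illustrates that convention only; it is an assumption for illustration, not the BNNSGraph implementation, so verify the operation's exact semantics against the framework's behavior.

```swift
// Sketch of the common thresholded ReLU convention:
// f(x) = x when x > alpha, otherwise 0.
// (Assumption: illustrative only; not the BNNSGraph implementation.)
func thresholdedReLUSketch(_ x: [Float], alpha: Float) -> [Float] {
    x.map { $0 > alpha ? $0 : 0 }
}

let output = thresholdedReLUSketch([-1.0, 0.5, 2.0], alpha: 1.0)
print(output) // [0.0, 0.0, 2.0]
```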