leakyReLU(with:alpha:name:)
Computes the leaky rectified linear unit (ReLU) activation function on the input tensor.
Declaration
func leakyReLU(with tensor: MPSGraphTensor, alpha: Double, name: String?) -> MPSGraphTensor
Parameters
- tensor:
An input tensor.
- alpha:
The scalar alpha value applied to the negative elements of the input tensor.
- name:
The name for the operation.
Return Value
A valid MPSGraphTensor object.
Discussion
The operation is: f(x) = max(x, alpha * x). For 0 < alpha < 1, positive elements pass through unchanged and negative elements are scaled by alpha.
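A minimal sketch of how this operation might be used. The placeholder shape, tensor names, and alpha value are illustrative, not prescribed by the API; the plain-Swift reference function is included only to show the element-wise behavior.

```swift
import MetalPerformanceShadersGraph

// Reference implementation of the element-wise operation: f(x) = max(x, alpha * x).
// Positive inputs pass through; negative inputs are scaled by alpha.
func leakyReLUReference(_ x: Float, alpha: Float) -> Float {
    max(x, alpha * x)
}

// Building the equivalent node in a graph (shape, names, and alpha are illustrative).
let graph = MPSGraph()
let input = graph.placeholder(shape: [4], dataType: .float32, name: "input")
let activated = graph.leakyReLU(with: input, alpha: 0.01, name: "leakyReLU")
```

The returned tensor has the same shape and data type as the input, so it can be fed directly into subsequent graph operations.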