leakyReLU(with:alphaTensor:name:)
Computes the leaky rectified linear unit (ReLU) activation function on the input tensor.
Declaration
func leakyReLU(with tensor: MPSGraphTensor, alphaTensor: MPSGraphTensor, name: String?) -> MPSGraphTensor
Parameters
- tensor:
The input tensor.
- alphaTensor:
The tensor containing the alpha values; it must be broadcast-compatible with the input tensor.
- name:
The name for the operation.
Return Value
A valid MPSGraphTensor object.
Discussion
The operation is: f(x) = max(x, alpha * x). For positive inputs this returns x unchanged; for negative inputs it returns x scaled by alpha. This operation supports broadcasting with the alpha tensor.
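The elementwise behavior can be illustrated in plain Swift. This is a numerical sketch of the formula max(x, alpha * x) with a scalar alpha, not a call into MPSGraph itself; in a real graph you would pass the input and alpha as MPSGraphTensor objects instead.

```swift
// Numerical sketch of leaky ReLU: f(x) = max(x, alpha * x).
// A scalar alpha stands in for the broadcast alphaTensor.
func leakyReLU(_ x: [Float], alpha: Float) -> [Float] {
    x.map { max($0, alpha * $0) }
}

let input: [Float] = [-2.0, -0.5, 0.0, 1.5]
let output = leakyReLU(input, alpha: 0.1)
// Negative values are scaled by alpha; non-negative values pass through.
print(output)
```

Because `alphaTensor` supports broadcasting, a single-element alpha tensor applies the same slope to every element of the input, while a full-shape alpha tensor can give each element its own negative-side slope.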