reLUGradient(withIncomingGradient:sourceTensor:name:)
Computes the gradient of the ReLU (rectified linear activation unit) function using the incoming gradient. Because ReLU is f(x) = max(x, 0), each element of the result equals the corresponding incoming gradient element where the source element is positive, and zero elsewhere.
Declaration
func reLUGradient(withIncomingGradient gradient: MPSGraphTensor, sourceTensor source: MPSGraphTensor, name: String?) -> MPSGraphTensor
Parameters
- gradient:
The incoming gradient tensor.
- source:
The input tensor from the forward pass.
- name:
The name for the operation.
Return Value
A valid MPSGraphTensor object containing the gradient of the loss with respect to the source tensor.
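A minimal sketch of how this operation might be wired into a graph. The placeholder shapes, tensor names, and the surrounding loss computation are illustrative assumptions, not part of the API above:

```swift
import MetalPerformanceShadersGraph

let graph = MPSGraph()

// Forward-pass input and the gradient flowing back from downstream ops.
// The shape [2, 3] and the names are illustrative only.
let source = graph.placeholder(shape: [2, 3],
                               dataType: .float32,
                               name: "source")
let incomingGradient = graph.placeholder(shape: [2, 3],
                                         dataType: .float32,
                                         name: "incomingGradient")

// dL/dsource: passes incomingGradient through where source > 0,
// and produces zero elsewhere.
let reluGrad = graph.reLUGradient(withIncomingGradient: incomingGradient,
                                  sourceTensor: source,
                                  name: "reluGradient")
```

The resulting `reluGrad` tensor can then be fed to subsequent gradient operations or fetched when the graph is run against a Metal device. Note that running the graph requires supplying `MPSGraphTensorData` for both placeholders.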