leakyReLUGradient(withIncomingGradient:sourceTensor:alphaTensor:name:)

Computes the gradient of the leaky rectified linear unit (ReLU) activation.

Declaration

func leakyReLUGradient(withIncomingGradient gradient: MPSGraphTensor, sourceTensor source: MPSGraphTensor, alphaTensor: MPSGraphTensor, name: String?) -> MPSGraphTensor

Parameters

  • gradient:

    The incoming gradient tensor.

  • source:

    The input tensor from the forward pass.

  • alphaTensor:

    The alpha tensor, which scales negative inputs.

  • name:

    The name for the operation.

Return Value

A valid MPSGraphTensor object.

Discussion

This operation computes the incoming gradient scaled by 1 where the source tensor is positive, and by alpha elsewhere. It supports broadcasting with the alpha tensor.
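
A minimal sketch of a backward pass using this operation. The shapes, the alpha value of 0.01, and the placeholder names are illustrative assumptions, not part of the documented API:

```swift
import MetalPerformanceShadersGraph

let graph = MPSGraph()

// Hypothetical tensors; shape [2, 3] and alpha = 0.01 are assumptions.
let source = graph.placeholder(shape: [2, 3],
                               dataType: .float32,
                               name: "source")
let incomingGradient = graph.placeholder(shape: [2, 3],
                                         dataType: .float32,
                                         name: "incomingGradient")
// A scalar alpha tensor; it broadcasts against the source tensor.
let alpha = graph.constant(0.01, shape: [1], dataType: .float32)

// Gradient of leaky ReLU: incomingGradient * 1 where source > 0,
// incomingGradient * alpha otherwise.
let sourceGradient = graph.leakyReLUGradient(withIncomingGradient: incomingGradient,
                                             sourceTensor: source,
                                             alphaTensor: alpha,
                                             name: nil)
```

Because alpha is passed as a tensor rather than a scalar constant, it can itself be a graph input, which allows alpha to vary between runs without rebuilding the graph.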