reLU(with:name:)

Computes the ReLU (rectified linear unit) activation function on the input tensor.

Declaration

func reLU(with tensor: MPSGraphTensor, name: String?) -> MPSGraphTensor

Parameters

  • tensor:

    The input tensor.

  • name:

    The name for the operation.

Return Value

A valid MPSGraphTensor object containing the elementwise ReLU of the input.

Discussion

The operation is applied elementwise: f(x) = max(x, 0).
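
As an illustration, a minimal sketch of using this operation in a graph; the placeholder shape, tensor names, and run setup are assumptions for the example, not part of this reference:

```swift
import Metal
import MetalPerformanceShadersGraph

let graph = MPSGraph()

// A 1-D placeholder holding four float values.
let input = graph.placeholder(shape: [4], dataType: .float32, name: "input")

// Apply f(x) = max(x, 0) elementwise.
let output = graph.reLU(with: input, name: "relu")

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal is unavailable on this system")
}

// Feed [-2, -0.5, 0, 3]; ReLU should produce [0, 0, 0, 3].
let values: [Float] = [-2, -0.5, 0, 3]
let data = Data(bytes: values, count: values.count * MemoryLayout<Float>.stride)
let inputData = MPSGraphTensorData(device: MPSGraphDevice(mtlDevice: device),
                                   data: data,
                                   shape: [4],
                                   dataType: .float32)

let results = graph.run(feeds: [input: inputData],
                        targetTensors: [output],
                        targetOperations: nil)

var out = [Float](repeating: 0, count: 4)
results[output]?.mpsndarray().readBytes(&out, strideBytes: nil)
print(out)
```

Because the graph is built lazily, the ReLU node costs nothing until `run(feeds:targetTensors:targetOperations:)` executes it on the Metal device.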