relu()

Adds a Rectified Linear Unit (ReLU) activation operation to the current graph. ReLU computes the elementwise maximum of zero and its input, max(0, x).

Declaration

func relu() -> BNNSGraph.Builder.Tensor<T>
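The math that this graph operation performs can be illustrated in plain Swift, independent of BNNS. This sketch assumes nothing about the BNNSGraph API itself; it only shows the elementwise max(0, x) that ReLU applies:

```swift
// Plain-Swift illustration of ReLU: max(0, x) applied elementwise.
// This is not the BNNSGraph operation itself, just the math it encodes.
let input: [Float] = [-2.0, -0.5, 0.0, 1.5, 3.0]
let output = input.map { max(0, $0) }
// Negative values are clamped to zero; non-negative values pass through.
print(output)
```

In graph-building code, you would instead call `relu()` on an existing `BNNSGraph.Builder.Tensor` value, and the operation is appended to the graph rather than evaluated immediately.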