prelu(alpha:)
Adds a Parametric ReLU (PReLU) activation operation to the current graph.
Declaration
func prelu(alpha: [Float]) -> BNNSGraph.Builder.Tensor<T>

Discussion
For each channel i, the operation computes prelu(self) = max(0, self) + alpha[i] * min(0, self).
self must have at least two dimensions.
The number of elements in alpha must be either 1 or self.shape[1].
If alpha contains a single element, the operation applies that weight to every channel; otherwise, it applies the weight alpha[i] to channel i while performing the activation.
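The following sketch illustrates the computation in plain Swift, independent of the BNNS Graph API; the function name preluReference and the row-major [batch, channel, ...] memory layout are assumptions made for illustration only.

func preluReference(_ input: [Float], shape: [Int], alpha: [Float]) -> [Float] {
    precondition(shape.count >= 2, "input must have at least two dimensions")
    let channels = shape[1]
    precondition(alpha.count == 1 || alpha.count == channels,
                 "alpha must contain either 1 or shape[1] elements")
    // Number of elements in each per-channel slice of a row-major tensor.
    let innerSize = shape.dropFirst(2).reduce(1, *)
    return input.enumerated().map { index, x in
        let channel = (index / innerSize) % channels
        // A single-element alpha is shared across all channels.
        let a = alpha.count == 1 ? alpha[0] : alpha[channel]
        return max(0, x) + a * min(0, x)
    }
}

For example, preluReference([-2, 3, -1, 4], shape: [2, 2], alpha: [0.25, 0.5]) returns [-0.5, 3.0, -0.25, 4.0]: the negative values are scaled by their per-channel weights, and the positive values pass through unchanged.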