BNNSLossFunctionSoftmaxCrossEntropy
Softmax activation on input logits, and computation of cross-entropy loss with one-hot encoded labels.
Declaration
var BNNSLossFunctionSoftmaxCrossEntropy: BNNSLossFunction { get }
Discussion
BNNSLossFunctionSoftmaxCrossEntropy performs softmax on the input logits and computes the cross-entropy loss against one-hot encoded labels.
You can smooth the labels according to a smoothing factor.
You can scale the loss with either a scalar value or a weight matrix, and reduce the loss according to a reduction function.