BNNSLossFunctionSigmoidCrossEntropy
Sigmoid activation on input logits, and independent computation of cross-entropy loss for each class.
Declaration
var BNNSLossFunctionSigmoidCrossEntropy: BNNSLossFunction { get }
Discussion
BNNSLossFunctionSigmoidCrossEntropy performs sigmoid on input logits and computes cross entropy loss for each class independently.
You can smooth labels according to a smoothing factor.
You can scale the loss with either a scalar value or a weight matrix, and reduce the loss according to a reduction function.
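The computation described above can be sketched in plain Python. This is an illustrative model of the math only, not the BNNS implementation: labels are smoothed toward 0.5 by the smoothing factor, each class gets an independent binary cross-entropy term on the sigmoid of its logit, and the result is scaled by a scalar weight and mean-reduced. The function name, the scalar-only weight, and the fixed mean reduction are simplifying assumptions for this sketch.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_cross_entropy(logits, labels, label_smoothing=0.0, weight=1.0):
    # Illustrative sketch of the loss, not the BNNS implementation.
    losses = []
    for x, y in zip(logits, labels):
        # Smooth the hard label toward 0.5 by the smoothing factor.
        y = y * (1.0 - label_smoothing) + 0.5 * label_smoothing
        p = sigmoid(x)
        # Independent binary cross-entropy for this class.
        losses.append(-(y * math.log(p) + (1.0 - y) * math.log(1.0 - p)))
    # Scale by a scalar weight and reduce by arithmetic mean.
    return weight * sum(losses) / len(losses)
```

For example, a logit of 0 (sigmoid probability 0.5) against label 1 with no smoothing yields a loss of ln 2 per class.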