mindspore.nn.SoftMarginLoss
- class mindspore.nn.SoftMarginLoss(reduction='mean')
A loss class for two-class classification problems.
SoftMarginLoss creates a criterion that optimizes a two-class classification logistic loss between input tensor \(x\) and labels tensor \(y\) (containing 1 or -1).
\[\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{x.nelement()}\]
where \(x.nelement()\) is the number of elements in \(x\).
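As a quick sanity check, the formula can be evaluated directly in NumPy (a minimal sketch independent of the MindSpore API; the inputs reuse the example at the end of this page):

>>> import numpy as np
>>> x = np.array([[0.3, 0.7], [0.5, 0.5]], dtype=np.float32)  # logits
>>> y = np.array([[-1, 1], [1, -1]], dtype=np.float32)  # labels in {1, -1}
>>> # default 'mean' reduction: sum of log(1 + exp(-y*x)) divided by x.nelement()
>>> print(np.log1p(np.exp(-y * x)).sum() / x.size)
0.6764238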
- Parameters
reduction (str, optional) – Apply a specific reduction method to the output: 'none', 'mean' or 'sum'. Default: 'mean'.
'none': no reduction will be applied.
'mean': compute and return the mean of the elements in the output.
'sum': the output elements will be summed.
- Inputs:
logits (Tensor) - Predicted data. The data type must be float16, float32 or bfloat16 (the Atlas training series products do not support bfloat16).
labels (Tensor) - Ground truth data, with the same shape as logits. In GE mode, the data type must be the same as that of logits.
- Outputs:
Tensor or Scalar. If reduction is 'none', the output has the same shape as logits. Otherwise, a scalar value is returned.
- Raises
TypeError – If logits or labels is not a Tensor.
TypeError – If dtype of logits or labels is not float16, float32 or bfloat16 (the Atlas training series products do not support bfloat16).
ValueError – If the shape of logits is not the same as that of labels.
ValueError – If reduction is not one of 'none', 'mean', 'sum'.
- Supported Platforms:
Ascend GPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> loss = nn.SoftMarginLoss()
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
0.6764238
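To obtain the unreduced, element-wise loss instead, pass reduction='none' (a short sketch reusing the logits and labels above; per the Outputs description, the result then keeps the shape of logits):

>>> loss_none = nn.SoftMarginLoss(reduction='none')
>>> output_none = loss_none(logits, labels)
>>> # one loss value per element, same shape as logits
>>> print(output_none.shape)
(2, 2)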