mindspore.ops.soft_margin_loss
- mindspore.ops.soft_margin_loss(input, target, reduction='mean')[source]
- Calculate the soft margin loss of input and target.

  Creates a criterion that optimizes a two-class classification logistic loss between input tensor \(x\) and target tensor \(y\) (containing 1 or -1):

  \[\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i]*x[i]))}{x.\text{nelement}()}\]

  where \(x.\text{nelement}()\) is the number of elements of \(x\).

- Warning

  This is an experimental API that is subject to change or deletion.
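To make the formula concrete, here is a minimal NumPy reference sketch (an illustration only, not the library implementation; soft_margin_loss_ref is a hypothetical helper name):

>>> import numpy as np
>>> def soft_margin_loss_ref(x, y):
...     # per-element two-class logistic loss: log(1 + exp(-y * x)),
...     # divided by x.nelement(), i.e. the number of elements of x
...     return np.log1p(np.exp(-y * x)).sum() / x.size
>>> x = np.array([[0.3, 0.7], [0.5, 0.5]], dtype=np.float32)
>>> y = np.array([[-1, 1], [1, -1]], dtype=np.float32)
>>> loss = soft_margin_loss_ref(x, y)  # ~0.6764238, matching the Examples below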
- Parameters
- input (Tensor) – Predict data. Data type must be float16, float32 or bfloat16 (bfloat16 is not supported on Atlas training series products).
- target (Tensor) – Ground truth data, with the same shape as input. In GE mode, its data type must be the same as that of input.
- reduction (str, optional) – Apply specific reduction method to the output: 'none', 'mean', 'sum'. Default: 'mean'.
  - 'none': no reduction will be applied.
  - 'mean': compute and return the mean of elements in the output.
  - 'sum': the output elements will be summed.
 
 
- Returns
- Tensor or Scalar. If reduction is 'none', its shape is the same as that of input. Otherwise, a scalar value will be returned.
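As a quick check of the shape behavior under each reduction mode, a minimal sketch reusing the tensors from the Examples below:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> # 'none' keeps the per-element losses, so the shape matches input
>>> ops.soft_margin_loss(logits, labels, reduction='none').shape
(2, 2)
>>> # 'mean' (the default) and 'sum' reduce to a scalar
>>> ops.soft_margin_loss(logits, labels, reduction='sum').shape
()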
- Raises
- TypeError – If input or target is not a Tensor. 
- TypeError – If dtype of input or target is not float16, float32 or bfloat16 (bfloat16 is not supported on Atlas training series products).
- ValueError – If shape of input is not the same as that of target. 
- ValueError – If reduction is not one of 'none', 'mean' or 'sum'.
 
 - Supported Platforms:
- Ascend GPU
- Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([[0.3, 0.7], [0.5, 0.5]]), mindspore.float32)
>>> labels = Tensor(np.array([[-1, 1], [1, -1]]), mindspore.float32)
>>> output = ops.soft_margin_loss(logits, labels)
>>> print(output)
0.6764238
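Targets are expected to contain 1 or -1. If labels come as 0/1, a simple affine map converts them; a minimal sketch reusing the session above (pm_one is our own name):

>>> zero_one = Tensor(np.array([[0, 1], [1, 0]]), mindspore.float32)
>>> pm_one = zero_one * 2 - 1  # map {0, 1} -> {-1, 1}
>>> print(ops.soft_margin_loss(logits, pm_one))
0.6764238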