mindspore.nn.SmoothL1Loss

class mindspore.nn.SmoothL1Loss(beta=1.0, reduction='none')[source]

SmoothL1 loss function. If the element-wise absolute error between the predicted value and the target value is less than the set threshold beta, the quadratic term is used; otherwise, the absolute error term is used.

Given two inputs \(x\) and \(y\), the SmoothL1Loss can be described as follows:

\[\begin{split}L_{i} = \begin{cases} \frac{0.5 (x_i - y_i)^{2}}{\beta}, & \text{if } |x_i - y_i| < {\beta} \\ |x_i - y_i| - 0.5 {\beta}, & \text{otherwise.} \end{cases}\end{split}\]

where \({\beta}\) represents the threshold beta.
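For reference, the piecewise definition above can be reproduced with a small NumPy sketch (an illustrative helper only, not part of the MindSpore API; the name smooth_l1_np is hypothetical):

>>> import numpy as np
>>> def smooth_l1_np(x, y, beta=1.0):
...     # Quadratic branch below the threshold beta, linear branch otherwise.
...     diff = np.abs(x - y)
...     return np.where(diff < beta, 0.5 * diff ** 2 / beta, diff - 0.5 * beta)
>>> smooth_l1_np(np.array([1., 2., 3.]), np.array([1., 2., 2.]))
array([0. , 0. , 0.5])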

If reduction is not 'none', then:

\[\begin{split}L = \begin{cases} \operatorname{mean}(L_{i}), & \text{if reduction} = \text{'mean';}\\ \operatorname{sum}(L_{i}), & \text{if reduction} = \text{'sum'.} \end{cases}\end{split}\]
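As an illustrative sketch, the reductions simply average or sum the element-wise losses \(L_{i}\) computed in the 'none' case; the expected values below follow from the formulas (exact print formatting may vary):

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> # Element-wise losses are [0., 0., 0.5]; 'mean' averages them, 'sum' adds them.
>>> print(nn.SmoothL1Loss(reduction='mean')(logits, labels))
0.16666667
>>> print(nn.SmoothL1Loss(reduction='sum')(logits, labels))
0.5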


Parameters
  • beta (float) – The threshold at which the loss changes between L1Loss and L2Loss behaviour; must be greater than 0. Default: 1.0.

  • reduction (str, optional) –

    Apply the specified reduction method to the output: 'none', 'mean', 'sum'. Default: 'none'.

    • 'none': no reduction will be applied.

    • 'mean': compute and return the mean of elements in the output.

    • 'sum': the output elements will be summed.

Inputs:
  • logits (Tensor) - Predicted value. Tensor of any dimension. Data type must be float16 or float32.

  • labels (Tensor) - Ground truth data, with the same shape and dtype as logits.

Outputs:

Tensor. If reduction is 'none', the output is a tensor with the same shape as logits. Otherwise, the output is a scalar Tensor with shape \(()\).

Raises
  • TypeError – If beta is not a float.

  • ValueError – If reduction is not one of 'none', 'mean', 'sum'.

  • TypeError – If logits or labels are not Tensor.

  • TypeError – If dtype of logits or labels is neither float16 nor float32.

  • TypeError – If dtype of logits is not the same as that of labels.

  • ValueError – If beta is less than or equal to 0.

  • ValueError – If shape of logits is not the same as that of labels.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> loss = nn.SmoothL1Loss()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
[0.  0.  0.5]
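
As a further illustration, a run with a smaller threshold exercises the linear branch: with beta=0.5, the last element has |3 - 2| = 1 >= 0.5, so its loss is 1 - 0.5 * 0.5 = 0.75 (the values shown follow from the formula above; exact print formatting may vary):

>>> loss = nn.SmoothL1Loss(beta=0.5)
>>> output = loss(logits, labels)
>>> print(output)
[0.   0.   0.75]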