mindspore.nn.RMSELoss

class mindspore.nn.RMSELoss

RMSELoss creates a criterion to measure the root mean square error between \(x\) and \(y\), where \(x\) is the input and \(y\) is the label.

For simplicity, let \(x\) and \(y\) be 1-dimensional Tensors of length \(N\); the loss between \(x\) and \(y\) is then given as:

\[loss = \sqrt{\frac{1}{N}\sum_{i=1}^{N}{(x_i-y_i)^2}}\]
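For instance, with \(x = (1, 2, 3)\) and \(y = (1, 2, 2)\) (Case 1 in the Examples below), the loss evaluates to:

\[loss = \sqrt{\frac{(1-1)^2 + (2-2)^2 + (3-2)^2}{3}} = \sqrt{\frac{1}{3}} \approx 0.5774\]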
Inputs:
  • logits (Tensor) - Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions.

  • labels (Tensor) - Tensor of shape \((N, *)\), normally with the same shape as logits. However, logits and labels may have different shapes as long as they can be broadcast to each other, as demonstrated in Case 2 of the Examples below.

Outputs:

Tensor, a scalar float Tensor of shape \(()\) containing the loss.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> loss = nn.RMSELoss()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
0.57735026
>>> # Case 2: logits.shape = (3,), labels.shape = (2, 3)
>>> loss = nn.RMSELoss()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([[1, 1, 1], [1, 2, 2]]), mindspore.float32)
>>> output = loss(logits, labels)
>>> print(output)
1.0
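
The broadcast result in Case 2 can be sanity-checked directly with NumPy, which applies the same broadcasting rules (a minimal sketch using only the arrays defined above):

>>> # Manual check of Case 2: broadcast, square, average, then take the root
>>> x = np.array([1, 2, 3], np.float32)
>>> y = np.array([[1, 1, 1], [1, 2, 2]], np.float32)
>>> print(np.sqrt(np.mean((x - y) ** 2)))
1.0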