mindspore.nn.LossBase
- class mindspore.nn.LossBase(reduction='mean')
Base class for loss functions.
When defining a custom loss function, override construct and call the method self.get_loss to apply the reduction method to the computed loss.
- Parameters:
reduction (str, optional) - Specifies the reduction method applied to the output. One of 'none', 'mean', or 'sum'. Default: 'mean'.
'none': no reduction is applied. 'mean': compute the (weighted) mean of the output elements. 'sum': compute the sum of the output elements.
- Raises:
ValueError - reduction is not one of 'none', 'mean', or 'sum'.
- Supported Platforms:
Ascend GPU CPU
- Examples:
>>> import mindspore
>>> from mindspore import ops, Tensor, nn
>>> import numpy as np
>>>
>>> class Net(nn.LossBase):
...     def __init__(self, reduction='mean'):
...         super(Net, self).__init__(reduction)
...         self.abs = ops.Abs()
...
...     def construct(self, logits, labels):
...         x = self.abs(logits - labels)
...         output = self.get_loss(x)
...         axis = self.get_axis(x)
...         return output, axis
>>> net = Net()
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output, axis = net(logits, labels)
>>> print(output)
0.33333334
>>> print(axis)
(0,)
>>> # Case 2: logits.shape = labels.shape = (3, 3)
>>> logits = Tensor(np.array([[1, 2, 3],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> labels = Tensor(np.array([[1, 2, 2],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> output, axis = net(logits, labels)
>>> print(output)
0.11111111
>>> print(axis)
(0, 1)
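As a minimal sketch of how the reduction argument changes the result (this snippet is not part of the upstream example and reuses the Net class defined above): with reduction='sum', get_loss returns the sum of the element-wise absolute errors instead of their mean. For the Case 1 inputs the errors are [0, 0, 1], so the reduced loss should be 1.0 rather than 0.33333334.

>>> # Sketch: assumes Net, Tensor, np, and mindspore from the example above are in scope.
>>> net_sum = Net(reduction='sum')
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output, axis = net_sum(logits, labels)
>>> print(output)
1.0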