Differences with torch.nn.functional.binary_cross_entropy


torch.nn.functional.binary_cross_entropy

torch.nn.functional.binary_cross_entropy(
    input,
    target,
    weight=None,
    size_average=None,
    reduce=None,
    reduction='mean'
) -> Tensor

For more information, see torch.nn.functional.binary_cross_entropy.

mindspore.ops.binary_cross_entropy

mindspore.ops.binary_cross_entropy(
    logits,
    labels,
    weight=None,
    reduction='mean'
) -> Tensor

For more information, see mindspore.ops.binary_cross_entropy.

Differences

PyTorch: Computes the binary cross-entropy loss between the predicted values and the target values.

MindSpore: This API in MindSpore implements basically the same functionality as PyTorch.

| Categories | Subcategories | PyTorch | MindSpore | Differences |
| ---------- | ------------- | ------- | --------- | ----------- |
| Parameters | Parameter 1 | input | logits | Same function, different parameter names |
| | Parameter 2 | target | labels | Same function, different parameter names |
| | Parameter 3 | weight | weight | Same function |
| | Parameter 4 | size_average | - | Deprecated in PyTorch; its function is taken over by reduction |
| | Parameter 5 | reduce | - | Deprecated in PyTorch; its function is taken over by reduction |
| | Parameter 6 | reduction | reduction | Same function |

Code Example 1

The two APIs implement the same functionality and are used in the same way.

# PyTorch
import torch
import torch.nn.functional as F
from torch import tensor

logits = tensor([0.1, 0.2, 0.3], requires_grad=True)
labels = tensor([1., 1., 1.])
loss = F.binary_cross_entropy(logits, labels)
print(loss.detach().numpy())
# 1.7053319

# MindSpore
import mindspore
import numpy as np
from mindspore import Tensor
from mindspore import ops

logits = Tensor(np.array([0.1, 0.2, 0.3]), mindspore.float32)
labels = Tensor(np.array([1., 1., 1.]), mindspore.float32)
loss = ops.binary_cross_entropy(logits, labels)
print(loss)
# 1.7053319
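
Code Example 2

Both APIs also accept an element-wise weight and a reduction mode. The following is a minimal sketch (not taken from either framework's documentation) showing weight together with reduction='sum'; the weight values are illustrative and the printed result is approximate.

# PyTorch
import torch.nn.functional as F
from torch import tensor

logits = tensor([0.1, 0.2, 0.3], requires_grad=True)
labels = tensor([1., 1., 1.])
weight = tensor([1., 2., 3.])
# Element-wise weighted binary cross-entropy, summed over all elements
loss = F.binary_cross_entropy(logits, labels, weight=weight, reduction='sum')
print(loss.detach().numpy())
# approximately 9.133379

# MindSpore
import mindspore
import numpy as np
from mindspore import Tensor, ops

logits = Tensor(np.array([0.1, 0.2, 0.3]), mindspore.float32)
labels = Tensor(np.array([1., 1., 1.]), mindspore.float32)
weight = Tensor(np.array([1., 2., 3.]), mindspore.float32)
# Element-wise weighted binary cross-entropy, summed over all elements
loss = ops.binary_cross_entropy(logits, labels, weight=weight, reduction='sum')
print(loss)
# approximately 9.133379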