Function Differences with tf.nn.softmax_cross_entropy_with_logits

tf.nn.softmax_cross_entropy_with_logits

tf.nn.softmax_cross_entropy_with_logits(
    _sentinel=None,
    labels=None,
    logits=None,
    dim=-1,
    name=None,
    axis=None
)

For more information, see tf.nn.softmax_cross_entropy_with_logits.

mindspore.nn.SoftmaxCrossEntropyWithLogits

class mindspore.nn.SoftmaxCrossEntropyWithLogits(
    sparse=False,
    reduction='none'
)(logits, labels)

For more information, see mindspore.nn.SoftmaxCrossEntropyWithLogits.

Usage

TensorFlow: labels must have the same shape as logits (i.e. one-hot encoded), and no reduction parameter is provided to take the mean or sum of the loss.

MindSpore: supports sparse labels (class indices rather than one-hot vectors) through the sparse parameter, and the reduction parameter can be used to take the mean or sum of the loss.
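
As a quick sketch of the reduction options (reusing the tensor values from the code example below; the printed values are the expected results, which may differ slightly in the last digits), MindSpore can keep the per-sample losses, average them, or sum them:

# A minimal sketch of MindSpore's reduction parameter.
import numpy as np
import mindspore as ms
import mindspore.nn as nn

logits = ms.Tensor(np.array([[3, 5, 6, 9], [42, 12, 32, 72]]), ms.float32)
labels = ms.Tensor(np.array([1, 0]), ms.int32)  # sparse labels: class indices
for reduction in ('none', 'mean', 'sum'):
    loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction=reduction)
    print(reduction, loss(logits, labels))
# Out:
# none [ 4.068202 30.      ]
# mean 17.034101
# sum 34.068203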

Code Example

# The following implements SoftmaxCrossEntropyWithLogits with MindSpore.
import numpy as np
import tensorflow as tf
import mindspore.nn as nn
import mindspore as ms

# sparse=True takes class-index labels; reduction='sum' adds up the per-sample losses.
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='sum')
logits = ms.Tensor(np.array([[3, 5, 6, 9], [42, 12, 32, 72]]), ms.float32)
labels_np = np.array([1, 0]).astype(np.int32)
labels = ms.Tensor(labels_np)
output = loss(logits, labels)
print(output)
# Out:
# 34.068203


# The following implements softmax_cross_entropy_with_logits with TensorFlow.
# TensorFlow requires one-hot labels with the same shape as logits.
logits = tf.constant([[3, 5, 6, 9], [42, 12, 32, 72]], dtype=tf.float32)
labels = tf.constant([[0, 1, 0, 0], [1, 0, 0, 0]], dtype=tf.float32)
output = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
ss = tf.Session()
print(ss.run(output))
# Out:
# [ 4.068202 30.      ]
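
To reproduce TensorFlow's unreduced, per-sample output in MindSpore, pass one-hot labels with sparse=False and reduction='none'. A minimal sketch reusing the tensors above (the printed values are the expected per-sample losses):

# The following reproduces the per-sample losses with MindSpore.
loss_none = nn.SoftmaxCrossEntropyWithLogits(sparse=False, reduction='none')
logits = ms.Tensor(np.array([[3, 5, 6, 9], [42, 12, 32, 72]]), ms.float32)
onehot_labels = ms.Tensor(np.array([[0, 1, 0, 0], [1, 0, 0, 0]]), ms.float32)
print(loss_none(logits, onehot_labels))
# Out:
# [ 4.068202 30.      ]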