mindspore.nn.Softmax
- class mindspore.nn.Softmax(axis=-1)
Softmax activation function. It is a generalization of the binary classification function mindspore.nn.Sigmoid to multi-class classification, and its purpose is to present multi-classification results in the form of probabilities. It computes the exponential of each element of the input Tensor along the given axis, then normalizes the values so that they lie in the range [0, 1] and sum to 1.
Softmax is defined as:
\[\text{softmax}(input_{i}) = \frac{\exp(input_i)}{\sum_{j=0}^{n-1}\exp(input_j)},\]

where \(input_{i}\) is the \(i\)-th slice in the given dimension of the input Tensor.
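To make the formula concrete, here is a minimal NumPy sketch of the definition above (illustrative only, not part of the MindSpore API): exponentiate each element, then divide by the sum of exponentials.

>>> import numpy as np
>>> x = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])
>>> # softmax(x_i) = exp(x_i) / sum_j exp(x_j), per the formula above
>>> probs = np.exp(x) / np.exp(x).sum()
>>> print(round(float(probs.sum()), 6))
1.0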
Warning
Starting after version 2.9.0, axis will be renamed to dim, and the default value will change to align with the new interface. The signature will change to mindspore.nn.Softmax(dim=None).

- Parameters
axis (int, optional) – The axis along which the Softmax operation is applied. If the dimension of input is input.ndim, the valid range of axis is [-input.ndim, input.ndim); -1 means the last dimension. Default: -1. See the sketch below for how axis selects the normalized dimension.
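The following is a hedged sketch of the axis parameter on an arbitrary 2-D input (the values here are assumptions for illustration): with the default axis=-1 each row of the output sums to 1, while with axis=0 each column sums to 1.

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([[1., 2., 3.], [4., 5., 6.]]), mindspore.float32)
>>> # axis=-1 (default): normalize across each row
>>> print(np.round(nn.Softmax()(x).asnumpy().sum(axis=-1), 3))
[1. 1.]
>>> # axis=0: normalize down each column
>>> print(np.round(nn.Softmax(axis=0)(x).asnumpy().sum(axis=0), 3))
[1. 1. 1.]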
- Inputs:
input (Tensor) - The input of Softmax.
- Outputs:
Tensor, which has the same type and shape as input, with values in the range [0, 1].
- Raises
TypeError – If axis is neither an int nor a tuple.
ValueError – If axis is a tuple whose length is less than 1.
ValueError – If axis is a tuple whose elements are not all in range [-input.ndim, input.ndim).
- Supported Platforms:
Ascend GPU CPU
Examples
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> import numpy as np
>>> # axis = -1 (default); the returned values sum to 1.0.
>>> input = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> softmax = nn.Softmax()
>>> output = softmax(input)
>>> print(output)
[0.03168 0.01166 0.0861 0.636 0.2341 ]
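A common follow-on use is converting a batch of logits into per-sample class predictions; this is a hedged sketch (the batch values and workflow are assumptions, not part of this API) using the default axis=-1 so each sample's scores are normalized independently.

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, nn
>>> logits = Tensor(np.array([[2.0, 0.5, 0.3], [0.1, 1.2, 3.4]]), mindspore.float32)
>>> softmax = nn.Softmax()  # axis=-1: normalize each sample's scores
>>> probs = softmax(logits)
>>> print(probs.asnumpy().argmax(axis=-1))  # most likely class per sample
[0 2]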