mindspore.nn.Softmin

class mindspore.nn.Softmin(axis=-1)

Softmin activation function. It is the generalization of the two-category function mindspore.nn.Sigmoid to multiple classes, and its purpose is to present multi-classification results in the form of probabilities.

It computes the exponential of the negated elements of the input Tensor along the given axis, and then normalizes them so that they lie in the range [0, 1] and sum to 1.

Softmin is defined as:

\[\text{softmin}(x_{i}) = \frac{\exp(-x_i)}{\sum_{j=0}^{n-1}\exp(-x_j)},\]

where \(x_{i}\) is the \(i\)-th slice in the given dimension of the input Tensor.
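
For intuition, the formula can be checked directly with NumPy (a minimal sketch, independent of MindSpore; the array values below are illustrative only):

>>> import numpy as np
>>> x = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])
>>> e = np.exp(-x)               # exponentials of the negated elements
>>> softmin = e / e.sum()        # normalize so the entries sum to 1
>>> np.allclose(softmin.sum(), 1.0)
True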

Parameters

axis (Union[int, tuple[int]]) – The axis along which the Softmin operation is applied. If the input x has dimension x.ndim, the valid range of axis is [-x.ndim, x.ndim). -1 means the last dimension. Default: -1.

Inputs:
  • x (Tensor) - The input of Softmin, with data type of float16 or float32.

Outputs:

Tensor, which has the same type and shape as x, with values in the range [0, 1].

Raises
  • TypeError – If axis is neither an int nor a tuple.

  • TypeError – If dtype of x is neither float16 nor float32.

  • ValueError – If axis is a tuple whose length is less than 1.

  • ValueError – If axis is a tuple whose elements are not all in the range [-x.ndim, x.ndim).

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # axis = -1 (default), and the sum of the returned values is 1.0.
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> softmin = nn.Softmin()
>>> output = softmin(x)
>>> print(output)
[0.2341  0.636  0.0862  0.01165  0.03168 ]
>>> assert(1.0 == output.sum())
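
The axis parameter controls the dimension along which the normalization is performed. Continuing the session above with a 2-D input (a minimal sketch; the values are illustrative only), the default axis=-1 normalizes each row, while axis=0 normalizes each column:

>>> x2 = Tensor(np.array([[1., 2.], [3., 4.]]), mindspore.float32)
>>> row_softmin = nn.Softmin()(x2)        # axis=-1: each row sums to 1
>>> col_softmin = nn.Softmin(axis=0)(x2)  # axis=0: each column sums to 1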