mindspore.ops.softmin

mindspore.ops.softmin(x, axis=-1, *, dtype=None)

Applies the Softmin operation to the input tensor along the specified axis.

Warning

After version 2.9.0, the parameter axis will be renamed to dim, and the default value will change from -1 to None.

Given a slice \(x\) along the specified axis, the Softmin function for each element \(x_i\) is defined as follows:

\[\text{output}(x_i) = \frac{\exp(-x_i)}{\sum_{j = 0}^{N-1}\exp(-x_j)},\]

where \(N\) is the length of the slice along the given axis.
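
By this definition, Softmin is equivalent to Softmax applied to the negated input. As an informal sanity check (not part of the original page, and assuming mindspore.ops.softmax is available):

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([-1., -2., 0., 2., 1.]), mindspore.float32)
>>> # softmin(x) should agree elementwise with softmax(-x)
>>> print(np.allclose(ops.softmin(x).asnumpy(), ops.softmax(-x).asnumpy()))
True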

Parameters
  • x (Tensor) – Tensor of shape \((N, *)\), where \(*\) means any number of additional dimensions, with float16 or float32 data type.

  • axis (Union[int, tuple[int]], optional) – The axis to perform the Softmin operation. Default: -1.
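
For a multi-dimensional input, axis selects the dimension that is normalized. A minimal sketch (not from the original page) applying Softmin down the columns of a 2-D tensor:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x2 = Tensor(np.array([[1., 2.], [3., 4.]]), mindspore.float32)
>>> # axis=0 normalizes each column, so every column sums to 1
>>> out = ops.softmin(x2, axis=0)
>>> print(np.allclose(out.asnumpy().sum(axis=0), 1.0))
True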

Keyword Arguments

dtype (mindspore.dtype, optional) – If set, x is converted to dtype before the operation is performed, and the returned Tensor also has data type dtype. Default: None.
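
For instance (a sketch, not from the original page), a float16 input can be promoted to float32 before the computation:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x16 = Tensor(np.array([-1., 0., 1.]), mindspore.float16)
>>> # the input is cast to float32 first; the result is also float32
>>> out32 = ops.softmin(x16, dtype=mindspore.float32)
>>> print(out32.dtype)
Float32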

Returns

Tensor, with the same type and shape as x.
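
A quick sketch (not from the original page) confirming that shape and dtype are preserved when dtype is left unset:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> y = ops.softmin(Tensor(np.ones((2, 3)), mindspore.float32))
>>> # same shape and dtype as the input
>>> print(y.shape, y.dtype)
(2, 3) Float32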

Raises
  • TypeError – If axis is not an int or a tuple.

  • TypeError – If dtype of x is neither float16 nor float32.

  • ValueError – If axis is a tuple whose length is less than 1.

  • ValueError – If axis is a tuple whose elements are not all in range [-len(x.shape), len(x.shape)).
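
As an illustration (not part of the original page, and assuming the documented TypeError is raised), passing a non-integer axis fails:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([-1., 0., 1.]), mindspore.float32)
>>> try:
...     ops.softmin(x, axis=1.5)
... except TypeError:
...     print("axis must be an int or a tuple of ints")
axis must be an int or a tuple of ints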

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float16)
>>> output = ops.softmin(x)
>>> print(output)
[0.2341  0.636  0.0862  0.01165  0.03168]