mindspore.numpy.amin(a, axis=None, keepdims=False, initial=None, where=True)

Returns the minimum of an array or minimum along an axis.


Note: NumPy argument out is not supported. On GPU, the supported dtypes are np.float16 and np.float32.

Parameters:

  • a (Tensor) – Input data.

  • axis (None or int or tuple of integers, optional) – Defaults to None. Axis or axes along which to operate. By default, flattened input is used. If this is a tuple of integers, the minimum is selected over multiple axes, instead of a single axis or all the axes as before.

  • keepdims (bool, optional) – Defaults to False. If this is set to True, the axes which are reduced are left in the result as dimensions with size one. With this option, the result will broadcast correctly against the input array.

  • initial (Number, optional) – Defaults to None. The maximum value of an output element. Must be present to allow computation on an empty slice.

  • where (bool Tensor, optional) – Defaults to True. A boolean array which is broadcast to match the dimensions of a, and selects elements to include in the reduction. If a non-default value is passed, initial must also be provided.
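The interplay of where and initial can be surprising: output slots whose elements are all excluded by where fall back to initial. A minimal sketch, using plain NumPy as a stand-in (this page states the function follows NumPy semantics; the array values here are illustrative, not from this page):

```python
import numpy as np

a = np.arange(4).reshape((2, 2)).astype('float32')   # [[0., 1.], [2., 3.]]

# Exclude column 0 from the reduction; with no elements selected, its
# output slot falls back to `initial`, which also caps every result.
out = np.amin(a, axis=0, where=np.array([False, True]), initial=10)
print(out)  # [10.  1.]
```

This is why initial must be provided whenever a non-default where is passed: it serves as the reduction's identity for fully masked slices.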


Returns:

Tensor or scalar, minimum of a. If axis is None, the result is a scalar value. If axis is given, the result is an array of dimension a.ndim - 1.


Raises:

TypeError – If the input is not a tensor.

Supported Platforms:

Ascend GPU CPU


Examples:

>>> import mindspore.numpy as np
>>> a = np.arange(4).reshape((2,2)).astype('float32')
>>> output = np.amin(a)
>>> print(output)
0.0
>>> output = np.amin(a, axis=0)
>>> print(output)
[0. 1.]
>>> output = np.amin(a, axis=1)
>>> print(output)
[0. 2.]
>>> output = np.amin(a, where=np.array([False, True]), initial=10, axis=0)
>>> print(output)
[10.  1.]
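The keepdims and tuple-axis options are not exercised above. A short sketch, again using plain NumPy as a stand-in for the NumPy-compatible semantics this page describes (the 3-D input is an illustrative assumption):

```python
import numpy as np

a = np.arange(8).reshape((2, 2, 2)).astype('float32')

# keepdims=True keeps the reduced axis as a size-1 dimension, so the
# result still broadcasts cleanly against `a`.
out_keep = np.amin(a, axis=1, keepdims=True)
print(out_keep.shape)   # (2, 1, 2)

# A tuple axis reduces over several axes at once; only axis 1 remains.
out_tuple = np.amin(a, axis=(0, 2))
print(out_tuple)        # [0. 2.]
```

Keeping reduced dimensions is handy for normalization patterns such as `a - np.amin(a, axis=1, keepdims=True)`, which would fail to broadcast without keepdims.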