mindspore.nn.Moments

class mindspore.nn.Moments(axis=None, keep_dims=None)[source]

Calculates the mean and variance of the input x along the specified axis.

Parameters
  • axis (Union[int, tuple(int), None]) – The axis or axes along which the mean and variance are calculated. If None, the mean and variance are calculated over all elements of input_x. Default: None.

  • keep_dims (Union[bool, None]) – If true, the mean and variance keep the reduced dimensions (with size 1), so their rank matches that of input_x. If false, the reduced dimensions are removed. If None, it is treated as False. Default: None.

Inputs:
  • input_x (Tensor) - The tensor whose mean and variance are calculated. Only float16 and float32 data types are supported.

Outputs:
  • mean (Tensor) - The mean of input_x, with the same data type as input_x.

  • variance (Tensor) - The variance of input_x, with the same data type as input_x (see the cross-check sketched after this list).
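
The variance reported here is the population variance (the mean of squared deviations from the mean), as the values in the Examples section confirm. As a rough cross-check, the same results can be reproduced with NumPy's mean and var; this is a minimal sketch for illustration, not part of the MindSpore API:

>>> import numpy as np
>>> x = np.array([[[[1, 2, 3, 4], [3, 4, 5, 6]]]], dtype=np.float32)
>>> np.mean(x, axis=3, keepdims=True)  # matches the mean output in Examples
array([[[[2.5],
         [4.5]]]], dtype=float32)
>>> np.var(x, axis=3, keepdims=True)  # population variance, matches the variance output
array([[[[1.25],
         [1.25]]]], dtype=float32)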

Raises
  • TypeError – If axis is not an int, a tuple or None.

  • TypeError – If keep_dims is neither bool nor None.

  • TypeError – If dtype of input_x is neither float16 nor float32.

Supported Platforms:

Ascend GPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> net = nn.Moments(axis=3, keep_dims=True)
>>> input_x = Tensor(np.array([[[[1, 2, 3, 4], [3, 4, 5, 6]]]]), mindspore.float32)
>>> output = net(input_x)
>>> print(output)
(Tensor(shape=[1, 1, 2, 1], dtype=Float32, value=
[[[[ 2.50000000e+00],
   [ 4.50000000e+00]]]]), Tensor(shape=[1, 1, 2, 1], dtype=Float32, value=
[[[[ 1.25000000e+00],
   [ 1.25000000e+00]]]]))
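
axis also accepts a tuple of ints to reduce over several axes at once. The snippet below is a minimal sketch rather than part of the original example; input_x is the tensor defined above, and the expected values follow from the same population-variance definition:

>>> net2 = nn.Moments(axis=(2, 3), keep_dims=False)
>>> mean, variance = net2(input_x)
>>> # Expected: mean 3.5 and variance 2.25, each with shape (1, 1),
>>> # since keep_dims=False removes the two reduced axes from (1, 1, 2, 4).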