mindspore.mint.var

mindspore.mint.var(input, dim=None, *, correction=1, keepdim=False)

Calculates the variance over the dimensions specified by dim. dim can be a single dimension, a tuple of dimensions, or None to reduce over all dimensions.

The variance (\(\sigma^2\)) is calculated as:

\[\sigma^2 = \frac{1}{\max(0, N - \delta N)}\sum^{N - 1}_{i = 0}(x_i - \bar{x})^2\]

where \(x\) is the sample set of elements, \(\bar{x}\) is the sample mean, \(N\) is the number of samples and \(\delta N\) is the correction.
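
To make the denominator concrete, the following sketch (plain Python, independent of MindSpore) evaluates the formula by hand for the first column of the example input further below, [8, 5, 4], with the default correction=1:

>>> x = [8.0, 5.0, 4.0]   # first column of the example input below
>>> N, correction = len(x), 1
>>> mean = sum(x) / N
>>> var = sum((xi - mean) ** 2 for xi in x) / max(0, N - correction)
>>> round(var, 6)
4.333333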

Warning

This is an experimental API that is subject to change or deletion.

Parameters
  • input (Tensor) – The input tensor.

  • dim (None, int, tuple(int), optional) – Specifies the dimension(s) to reduce. If None, the variance is computed over all elements of input. Default: None.

Keyword Arguments
  • correction (int, optional) – The difference between the sample size and the sample degrees of freedom. Default: 1 (Bessel's correction).

  • keepdim (bool, optional) – Whether the output tensor retains the reduced dimensions as size-1 axes (see the shape sketch after this list). Default: False.
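
The following small, self-contained sketch (an illustration, not part of the official examples) shows how keepdim affects the output shape:

>>> import mindspore
>>> x = mindspore.tensor([[1.0, 2.0], [3.0, 4.0]], mindspore.float32)
>>> mindspore.mint.var(x, dim=0, keepdim=True).shape
(1, 2)
>>> mindspore.mint.var(x, dim=0, keepdim=False).shape
(2,)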

Returns

Tensor, the variance of input over the specified dimensions.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> input = mindspore.tensor([[8, 2, 1], [5, 9, 3], [4, 6, 7]], mindspore.float32)
>>> output = mindspore.mint.var(input, dim=0, correction=1, keepdim=True)
>>> print(output)
[[ 4.333333, 12.333333, 9.333333]]
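
As a further, unofficial sketch using the same input tensor: with dim=None the reduction runs over all nine elements (mean 5, sum of squared deviations 60), so the expected values in the comments follow directly from the formula above; the exact printed precision may differ on device.

>>> # Sample variance over all elements: 60 / (9 - 1) = 7.5
>>> all_var = mindspore.mint.var(input)
>>> # Population variance (correction=0): 60 / 9 ≈ 6.6667
>>> pop_var = mindspore.mint.var(input, correction=0)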