mindspore.common.initializer

Initializer for cell parameters.

class mindspore.common.initializer.Constant(value)[source]

Generates an array with constant value in order to initialize a tensor.

Parameters

value (Union[int, numpy.ndarray]) – The value to initialize.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer
>>> tensor1 = initializer(0, [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer(5, [1, 2, 3], mindspore.float32)
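Constant can also be passed explicitly as an instance; a minimal usage sketch:

>>> from mindspore.common.initializer import Constant
>>> tensor3 = initializer(Constant(5), [1, 2, 3], mindspore.float32)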
class mindspore.common.initializer.Dirac(groups=1)[source]

Generates an array with the Dirac delta function in order to initialize a tensor. It tries to preserve the identity of the input for convolution layers. For group convolution, each group of channels is preserved respectively.

Parameters

groups (int) – The number of groups in the convolution layer. Default: 1.

Raises
  • ValueError – If the dimension of the initialized tensor is not in [3, 4, 5].

  • ValueError – If the first dimension of the initialized tensor is not divisible by groups.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Dirac
>>> tensor1 = initializer(Dirac(groups=2), [6, 4, 3, 3], mindspore.float32)
>>> tensor2 = initializer("dirac", [6, 4, 3, 3], mindspore.float32)
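As a usage sketch (assuming, as described for the nn layers, that weight_init accepts an Initializer instance), a grouped convolution whose weight shape matches the example above can be initialized in place:

>>> import mindspore.nn as nn
>>> # weight shape is [6, 4, 3, 3] for in_channels=8, out_channels=6, group=2
>>> conv = nn.Conv2d(8, 6, 3, group=2, weight_init=Dirac(groups=2))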
class mindspore.common.initializer.HeNormal(negative_slope=0, mode='fan_in', nonlinearity='leaky_relu')[source]

Generates an array with values sampled from the He Kaiming Normal distribution \({N}(0, \text{sigma}^2)\) in order to initialize a tensor, where

\[sigma = \frac{gain} {\sqrt{fan\_mode}}\]

where \(gain\) is an optional scaling factor and \(fan\_mode\) is the number of input or output units of the weight tensor, depending on whether mode is ‘fan_in’ or ‘fan_out’.

For details of the HeNormal algorithm, please check https://arxiv.org/abs/1502.01852.

Parameters
  • negative_slope (int, float) – The negative slope of the rectifier used after this layer (only used when nonlinearity is ‘leaky_relu’). Default: 0.

  • mode (str) – Either ‘fan_in’ or ‘fan_out’. Choosing ‘fan_in’ preserves the magnitude of the variance of the weights in the forward pass. Choosing ‘fan_out’ preserves the magnitudes in the backwards pass. Default: ‘fan_in’.

  • nonlinearity (str) – The non-linear function, recommended to use only with ‘relu’ or ‘leaky_relu’. Default: ‘leaky_relu’.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, HeNormal
>>> tensor1 = initializer(HeNormal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('he_normal', [1, 2, 3], mindspore.float32)
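As a worked example, under the common assumptions that \(gain = \sqrt{2 / (1 + \text{negative\_slope}^2)}\) for ‘leaky_relu’ and that \(fan\_in\) for shape [1, 2, 3] is \(2 \times 3 = 6\), the first call above uses

\[sigma = \frac{\sqrt{2}}{\sqrt{6}} \approx 0.58\]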
class mindspore.common.initializer.HeUniform(negative_slope=0, mode='fan_in', nonlinearity='leaky_relu')[source]

Generates an array with values sampled from the He Kaiming Uniform distribution \({U}(-\text{boundary}, \text{boundary})\) in order to initialize a tensor, where

\[boundary = \text{gain} \times \sqrt{\frac{3}{fan\_mode}}\]

where \(gain\) is an optional scaling factor. If \(fan\_mode\) is ‘fan_in’, it is the number of input units of the weight tensor. If \(fan\_mode\) is ‘fan_out’, it is the number of output units of the weight tensor.

For details of the HeUniform algorithm, please check https://arxiv.org/abs/1502.01852.

Parameters
  • negative_slope (int, float, bool) – The negative slope of the rectifier used after this layer (only used when nonlinearity is ‘leaky_relu’). Default: 0.

  • mode (str) – Either ‘fan_in’ or ‘fan_out’. Choosing ‘fan_in’ preserves the magnitude of the variance of the weights in the forward pass. Choosing ‘fan_out’ preserves the magnitudes in the backwards pass. Default: ‘fan_in’.

  • nonlinearity (str) – The non-linear function, recommended to use only with ‘relu’ or ‘leaky_relu’. Default: ‘leaky_relu’.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, HeUniform
>>> tensor1 = initializer(HeUniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('he_uniform', [1, 2, 3], mindspore.float32)
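Under the same assumptions as for HeNormal above (\(gain = \sqrt{2}\) with the default negative_slope of 0, and \(fan\_in = 6\) for shape [1, 2, 3]), the sampling bound for the first call works out to

\[boundary = \sqrt{2} \times \sqrt{\frac{3}{6}} = 1\]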
class mindspore.common.initializer.Identity(**kwargs)[source]

Generates a 2-dimensional identity matrix array in order to initialize a tensor.

Raises

ValueError – If the dimension of input tensor is not equal to 2.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Identity
>>> tensor1 = initializer(Identity(), [2, 3], mindspore.float32)
>>> tensor2 = initializer('identity', [2, 3], mindspore.float32)
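For a non-square shape such as [2, 3] above, the result should correspond to a rectangular identity matrix; a NumPy sketch of the expected pattern (an illustration, not captured framework output):

>>> import numpy as np
>>> np.eye(2, 3)
array([[1., 0., 0.],
       [0., 1., 0.]])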
class mindspore.common.initializer.Initializer(**kwargs)[source]

The abstract base class of the initializer.

Parameters

kwargs (dict) – Keyword arguments for Initializer.

class mindspore.common.initializer.Normal(sigma=0.01, mean=0.0)[source]

Generates an array with values sampled from the Normal distribution \({N}(\text{mean}, \text{sigma}^2)\) in order to initialize a tensor.

\[f(x) = \frac{1}{\sqrt{2\pi} \, sigma} \exp\left(-\frac{(x - mean)^2}{2 \, sigma^2}\right)\]

Parameters
  • sigma (float) – The standard deviation of Normal distribution. Default: 0.01.

  • mean (float) – The mean of Normal distribution. Default: 0.0.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Normal
>>> tensor1 = initializer(Normal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('normal', [1, 2, 3], mindspore.float32)
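Non-default sigma and mean can be passed the same way; a minimal sketch:

>>> tensor3 = initializer(Normal(sigma=0.1, mean=0.5), [1, 2, 3], mindspore.float32)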
class mindspore.common.initializer.One(**kwargs)[source]

Generates an array with constant value of one in order to initialize a tensor.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, One
>>> tensor1 = initializer(One(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('ones', [1, 2, 3], mindspore.float32)
class mindspore.common.initializer.Orthogonal(gain=1.0)[source]

Generates a (semi-)orthogonal matrix array in order to initialize a tensor. The input tensor must have at least 2 dimensions. If the dimension is greater than 2, the trailing dimensions will be flattened.

Parameters

gain (float) – An optional scaling factor. Default: 1.

Raises

ValueError – If the dimension of input tensor is less than 2.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Orthogonal
>>> tensor1 = initializer(Orthogonal(gain=2.), [2, 3, 4], mindspore.float32)
>>> tensor2 = initializer('orthogonal', [2, 3, 4], mindspore.float32)
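For tensor1 above, the [2, 3, 4] shape is treated as a \(2 \times 12\) matrix whose rows are mutually orthogonal, so (up to numerical precision) the flattened weight \(W\) should satisfy

\[W W^{\top} = gain^2 I = 4 I\]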
class mindspore.common.initializer.Sparse(sparsity, sigma=0.01)[source]

Generates a 2-dimensional sparse matrix array in order to initialize a tensor. The non-zero positions will be filled with values sampled from the normal distribution \({N}(0, 0.01)\).

Parameters
  • sparsity (float) – The fraction of elements being set to zero in each column.

  • sigma (float) – The standard deviation of the normal distribution. Default: 0.01.

Raises

ValueError – If the dimension of input tensor is not equal to 2.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Sparse
>>> tensor1 = initializer(Sparse(sparsity=0.1, sigma=0.01), [5, 8], mindspore.float32)
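As a rough worked check (assuming the common convention of zeroing \(\lceil sparsity \times rows \rceil\) entries per column), the [5, 8] call above zeroes \(\lceil 0.1 \times 5 \rceil = 1\) entry in each of the 8 columns, with the remaining entries drawn from \({N}(0, 0.01)\).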
class mindspore.common.initializer.TruncatedNormal(sigma=0.01)[source]

Generates an array with values sampled from Truncated Normal distribution in order to initialize a tensor.

Parameters

sigma (float) – The standard deviation of Truncated Normal distribution. Default: 0.01.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, TruncatedNormal
>>> tensor1 = initializer(TruncatedNormal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('truncatedNormal', [1, 2, 3], mindspore.float32)
class mindspore.common.initializer.Uniform(scale=0.07)[source]

Generates an array with values sampled from Uniform distribution \({U}(-\text{scale}, \text{scale})\) in order to initialize a tensor.

Parameters

scale (float) – The bound of the Uniform distribution. Default: 0.07.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Uniform
>>> tensor1 = initializer(Uniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('uniform', [1, 2, 3], mindspore.float32)
class mindspore.common.initializer.VarianceScaling(scale=1.0, mode='fan_in', distribution='truncated_normal')[source]

Generates a random array with scaling in order to initialize a tensor. When distribution is ‘truncated_normal’ or ‘untruncated_normal’, the values will be sampled from a truncated or untruncated normal distribution with a mean of 0 and a scaled standard deviation \(stddev = \sqrt{\frac{scale}{n}}\). Here \(n\) is the number of input units if mode is ‘fan_in’, the number of output units if mode is ‘fan_out’, or the average of the two if mode is ‘fan_avg’. When distribution is ‘uniform’, the values will be sampled from a uniform distribution within the limit of \([-\sqrt{\frac{3*scale}{n}}, \sqrt{\frac{3*scale}{n}}]\).

Parameters
  • scale (float) – The scaling factor. Default: 1.0.

  • mode (str) – Should be ‘fan_in’, ‘fan_out’ or ‘fan_avg’. Default: ‘fan_in’.

  • distribution (str) – The type of distribution chosen to sample values from. It should be ‘uniform’, ‘truncated_normal’ or ‘untruncated_normal’. Default: ‘truncated_normal’.

Raises
  • ValueError – If scale is not greater than 0.

  • ValueError – If mode is not ‘fan_in’, ‘fan_out’ or ‘fan_avg’.

  • ValueError – If distribution is not ‘uniform’, ‘truncated_normal’ or ‘untruncated_normal’.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, VarianceScaling
>>> tensor1 = initializer(VarianceScaling(scale=1.0, mode='fan_out',
...                                       distribution='untruncated_normal'), [2, 3], mindspore.float32)
>>> tensor2 = initializer('varianceScaling', [2, 3], mindspore.float32)
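Purely as arithmetic on the formulas above (the exact fan computation is an implementation detail): with scale = 1.0 and, say, \(n = 2\),

\[stddev = \sqrt{\frac{1.0}{2}} \approx 0.71, \qquad \text{uniform limit} = \sqrt{\frac{3 \times 1.0}{2}} \approx 1.22\]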
class mindspore.common.initializer.XavierUniform(gain=1)[source]

Generates an array with values sampled from Xavier uniform distribution \({U}(-\text{boundary}, \text{boundary})\) in order to initialize a tensor, where

\[boundary = gain * \sqrt{\frac{6}{n_{in} + n_{out}}}\]

where \(gain\) is an optional scaling factor, \(n_{in}\) is the number of input units in the weight tensor, \(n_{out}\) is the number of output units in the weight tensor.

For details of XavierUniform algorithm, please check http://proceedings.mlr.press/v9/glorot10a.html.

Parameters

gain (float) – An optional scaling factor. Default: 1.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, XavierUniform
>>> tensor1 = initializer(XavierUniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('xavier_uniform', [1, 2, 3], mindspore.float32)
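Purely as arithmetic on the formula above: with gain = 1 and, say, \(n_{in} = 6\) and \(n_{out} = 3\),

\[boundary = 1 \times \sqrt{\frac{6}{6 + 3}} \approx 0.82\]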
class mindspore.common.initializer.Zero(**kwargs)[source]

Generates an array with constant value of zero in order to initialize a tensor.

Examples

>>> import mindspore
>>> from mindspore.common.initializer import initializer, Zero
>>> tensor1 = initializer(Zero(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('zeros', [1, 2, 3], mindspore.float32)
mindspore.common.initializer.initializer(init, shape=None, dtype=mstype.float32)[source]

Create and initialize a tensor.

Parameters
  • init (Union[Tensor, str, Initializer, numbers.Number]) –

    Initialize value.

    • str: init should be the alias of a class inheriting from Initializer; the corresponding class will be called in practice. The value of init can be “normal”, “ones”, “zeros”, etc.

    • Initializer: init should be an instance of a class inheriting from Initializer, which is used to initialize the tensor.

    • numbers.Number: Constant will be called to initialize the tensor.

  • shape (Union[tuple, list, int]) – The shape of the initialized tensor. Default: None.

  • dtype (mindspore.dtype) – The type of data in initialized tensor. Default: mindspore.float32.

Returns

Tensor, the initialized tensor with the given shape and dtype.

Raises
  • TypeError – If the type of the argument init is not correct.

  • ValueError – If the shape of the tensor passed via init does not match the shape argument.

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor
>>> from mindspore.common.initializer import initializer, One
>>> data = Tensor(np.zeros([1, 2, 3]), mindspore.float32)
>>> tensor1 = initializer(data, [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('ones', [1, 2, 3], mindspore.float32)
>>> tensor3 = initializer(One(), [1, 2, 3], mindspore.float32)
>>> tensor4 = initializer(0, [1, 2, 3], mindspore.float32)
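A common downstream pattern is to wrap the result in a Parameter, or to pass an initializer directly to a layer’s weight_init; a minimal sketch, assuming the usual mindspore.Parameter and mindspore.nn.Dense signatures:

>>> import mindspore.nn as nn
>>> from mindspore import Parameter
>>> from mindspore.common.initializer import HeNormal
>>> # wrap an initialized tensor as a trainable Parameter
>>> weight = Parameter(initializer('xavier_uniform', [4, 3], mindspore.float32), name='weight')
>>> # or let the layer call the initializer itself via weight_init
>>> dense = nn.Dense(3, 4, weight_init=HeNormal(nonlinearity='relu'))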