mindspore.mint.nn.Dropout

class mindspore.mint.nn.Dropout(p=0.5, inplace=False)

Dropout layer for the input.

Dropout is a regularization technique that reduces overfitting by preventing co-adaptation between neurons. During training, the operator randomly sets elements of the input to 0 with probability p, and the output is multiplied by \(\frac{1}{1-p}\) so that its expected value matches the input. During inference, this layer returns the input Tensor unchanged.
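A rough sketch of this inverted-dropout scaling rule, written in plain NumPy for illustration (not the operator's actual implementation):

>>> import numpy as np
>>> def inverted_dropout(x, p=0.5, training=True):
...     if not training or p == 0.0:
...         return x  # inference: identity mapping
...     # keep each element with probability 1 - p
...     mask = np.random.rand(*x.shape) >= p
...     # rescale survivors so the expected output equals the input
...     return x * mask / (1.0 - p)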

This technique was proposed in the paper Dropout: A Simple Way to Prevent Neural Networks from Overfitting and has proven effective at reducing over-fitting and preventing neurons from co-adapting. See Improving neural networks by preventing co-adaptation of feature detectors for more details.

Note

  • Each channel will be zeroed out independently on every construct call, so repeated calls over the same input draw independent masks (see the sketch after this note).

  • Parameter p is the probability that each element of the input tensor is zeroed.
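A minimal illustration of the first note (assuming an environment where mindspore.mint is available): two forward passes over the same input generally zero different positions, because a fresh mask is drawn on each call.

>>> import mindspore
>>> from mindspore import mint
>>> net = mint.nn.Dropout(p=0.5)
>>> net.set_train()  # dropout is only active in training mode
>>> x = mint.ones([8])
>>> # the two outputs usually differ in which positions are zeroed
>>> print(net(x))
>>> print(net(x))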

Parameters
  • p (float, optional) – Probability of an element to be zeroed. Default: 0.5.

  • inplace (bool, optional) – Whether to perform the operation in-place on the input tensor. Default: False. (See the sketch after this list.)
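A hedged sketch of the inplace flag, assuming it follows the usual in-place convention of writing the result back into the input tensor:

>>> import mindspore
>>> from mindspore import mint
>>> net = mint.nn.Dropout(p=0.5, inplace=True)
>>> net.set_train()
>>> x = mint.ones([2, 3])
>>> output = net(x)
>>> # with inplace=True, x itself is assumed to hold the dropped-and-scaled values
>>> print(x)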

Inputs:
  • input (Tensor) - The input tensor.

Outputs:

Tensor, with the same shape and data type as input.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> from mindspore import mint
>>> x = mindspore.tensor(mint.ones([2, 2, 3]), mindspore.float32)
>>> net = mint.nn.Dropout(p=0.2)
>>> net.set_train()
>>> output = net(x)
>>> print(output.shape)
(2, 2, 3)
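
During inference the layer acts as an identity mapping; a quick check, switching to evaluation mode with set_train(False):

>>> net.set_train(False)
>>> output = net(x)
>>> print((output == x).all())
True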