mindspore.mint.nn.ELU

class mindspore.mint.nn.ELU(alpha=1.0, inplace=False)

Exponential Linear Unit activation function.

Applies the exponential linear unit function element-wise. The activation function is defined as:

\[\text{ELU}_{i} = \begin{cases} x_i, & \text{if } x_i \geq 0; \\ \alpha * (\exp(x_i) - 1), & \text{otherwise.} \end{cases}\]

where \(x_i\) represents an element of the input and \(\alpha\) represents the alpha parameter, which determines the smoothness of the ELU.
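
As an illustration only (not part of the MindSpore API), the piecewise definition above can be mirrored with a small NumPy sketch:

>>> import numpy as np
>>> def elu_reference(x, alpha=1.0):
...     # x_i for x_i >= 0, alpha * (exp(x_i) - 1) otherwise
...     return np.where(x >= 0, x, alpha * (np.exp(x) - 1))
>>> print(elu_reference(np.array([-1.0, 0.0, 2.0])))
[-0.63212056  0.          2.        ]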

ELU Activation Function Graph:

../../_images/ELU.png

Warning

This is an experimental API that is subject to change or deletion.

Parameters
  • alpha (float, optional) – The alpha value of the ELU formulation, which scales the negative part of the output (see the sketch after this list). Default: 1.0.

  • inplace (bool, optional) – Whether to do the operation in-place. Default: False.
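
A minimal sketch of how alpha changes the result, reusing the operator from the Examples section below; the exact printed formatting may vary by version:

>>> import mindspore
>>> import numpy as np
>>> x = mindspore.tensor(np.array([-1.0, 1.0]), mindspore.float32)
>>> print(mindspore.mint.nn.ELU(alpha=1.0)(x))
[-0.63212055  1.        ]
>>> print(mindspore.mint.nn.ELU(alpha=2.0)(x))  # negative outputs are scaled by alpha
[-1.2642411  1.       ]
>>> # With inplace=True, the result would be written back into the input tensor.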

Inputs:
  • input (Tensor) - The input tensor of any dimension.

Outputs:

Tensor, with the same shape and type as the input.

Raises

RuntimeError – If the dtype of input is not float16, float32 or bfloat16.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> import numpy as np
>>> input = mindspore.tensor(np.array([-1, -2, 0, 2, 1]), mindspore.float32)
>>> elu = mindspore.mint.nn.ELU()
>>> result = elu(input)
>>> print(result)
[-0.63212055  -0.86466473  0.  2.  1.]
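
As a small follow-up to the example above (same session), the output preserves the input's shape and dtype, matching the Outputs description:

>>> print(result.shape, result.dtype)
(5,) Float32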