mindspore.ops.leaky_relu

mindspore.ops.leaky_relu(input, alpha=0.2)

Computes the leaky_relu activation function. Elements of input that are less than 0 are multiplied by alpha.

The activation function is defined as:

\[\text{leaky\_relu}(input) = \begin{cases}input, & \text{if } input \geq 0; \\ \alpha * input, & \text{otherwise.}\end{cases}\]

where \(\alpha\) represents the alpha parameter.
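As a quick illustration of this piecewise definition, the rule can be mirrored in plain NumPy (a minimal reference sketch only, not MindSpore's implementation; leaky_relu_ref is a hypothetical helper name):

>>> import numpy as np
>>> def leaky_relu_ref(x, alpha=0.2):
...     # keep non-negative elements, scale negative ones by alpha
...     return np.where(x >= 0, x, alpha * x)
>>> leaky_relu_ref(np.array([-1.0, 4.0, -8.0]))
array([-0.2,  4. , -1.6])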

For more details, see Rectifier Nonlinearities Improve Neural Network Acoustic Models.

LeakyReLU Activation Function Graph:

[figure: LeakyReLU.png]
Parameters
  • input (Tensor) – The input of leaky_relu is a Tensor of any dimension.

  • alpha (Union[int, float]) – Slope of the activation function for input elements less than 0. Default: 0.2.

Returns

Tensor, with the same data type and shape as input.

Raises
  • TypeError – If input is not a Tensor.

  • TypeError – If alpha is not a float or an int.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> print(ops.leaky_relu(x, alpha=0.2))
[[-0.2  4.  -1.6]
 [ 2.  -1.   9. ]]
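As a further usage sketch (same imports and x as above; exact print formatting may vary by version), omitting alpha applies the default 0.2, and an int alpha is also accepted per the parameter type:

>>> print(ops.leaky_relu(x))
[[-0.2  4.  -1.6]
 [ 2.  -1.   9. ]]
>>> print(ops.leaky_relu(x, alpha=1))
[[-1.  4. -8.]
 [ 2. -5.  9.]]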