mindspore.nn.RReLU

class mindspore.nn.RReLU(lower=1 / 8, upper=1 / 3)[source]

Randomized Leaky ReLU activation function.

The activation function is defined as:

\[\text{RReLU}(x_{ji}) = \begin{cases}x_{ji}, &\text{if } x_{ji} \geq 0; \cr {\alpha_{ji}} * x_{ji}, &\text{otherwise.}\end{cases}\]

where \(\alpha_{ji} \sim U(l, u)\), \(l \le u\).

Applies the RReLU function element-wise, as described in the paper: Empirical Evaluation of Rectified Activations in Convolutional Network.
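As an illustration of the piecewise definition above, the following NumPy sketch restates the element-wise computation. It is only a reference for the formula, not MindSpore's implementation, and the helper name rrelu_reference is made up here:

>>> import numpy as np
>>> def rrelu_reference(x, lower=1 / 8, upper=1 / 3):
...     # Draw an independent slope for every element from U(lower, upper).
...     alpha = np.random.uniform(lower, upper, size=x.shape)
...     # Non-negative elements pass through unchanged; negative elements are scaled by their slope.
...     return np.where(x >= 0, x, alpha * x)
...
>>> x = np.array([[-1.0, 4.0], [2.0, 0.0]], dtype=np.float32)
>>> out = rrelu_reference(x)
>>> # Only x[0, 0] is negative, so its scaling factor recovers the sampled slope.
>>> print(1 / 8 <= out[0, 0] / x[0, 0] <= 1 / 3)
True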

Parameters
  • lower (Union[int, float]) – Lower bound \(l\) of the uniform distribution from which the slope of the activation function at x < 0 is sampled. Default: 1/8.

  • upper (Union[int, float]) – Upper bound \(u\) of the uniform distribution from which the slope of the activation function at x < 0 is sampled. Default: 1/3.

Inputs:
  • x (Tensor) - The input of RReLU, a Tensor of any dimension.

Outputs:

Tensor, with the same data type and shape as x.

Raises
  • TypeError – If lower is not a float or an int.

  • TypeError – If upper is not a float or an int.

  • TypeError – If x is not a Tensor.

  • TypeError – If the dtype of x is neither mindspore.float16 nor mindspore.float32.

  • ValueError – If lower is greater than upper.
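A hedged sketch of the lower > upper check follows; whether the error surfaces at construction time or at the first call is not spelled out above, so both steps are guarded here:

>>> import mindspore
>>> import mindspore.nn as nn
>>> from mindspore import Tensor
>>> import numpy as np
>>> try:
...     bad = nn.RReLU(lower=0.5, upper=0.2)   # lower > upper
...     _ = bad(Tensor(np.array([-1.0]), mindspore.float32))
... except ValueError as e:
...     print(type(e).__name__)
ValueError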

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import mindspore.nn as nn
>>> from mindspore import Tensor
>>> import numpy as np
>>> x = Tensor(np.array([[-1.0, 4.0], [2.0, 0]]), mindspore.float32)
>>> r_relu = nn.RReLU()
>>> output = r_relu(x)
>>> print(output)
[[-0.31465699  4.        ]
 [ 2.          0.        ]]
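Because the slope for each negative element is drawn at random from \(U(l, u)\), the negative entry printed above will differ between runs; only the sign-preserving structure and the slope range are fixed. The following is a minimal additional sketch (not from the official examples) that constructs the cell with custom bounds and checks that the realized slopes stay inside them:

>>> r_relu = nn.RReLU(lower=0.1, upper=0.5)
>>> x = Tensor(np.array([[-2.0, -1.0], [3.0, 0.0]]), mindspore.float32)
>>> output = r_relu(x)
>>> # Recover the applied slopes from the two negative inputs in the first row.
>>> slopes = output.asnumpy()[0] / x.asnumpy()[0]
>>> print(bool(np.all((slopes >= 0.1) & (slopes <= 0.5))))
True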