mindspore.nn.ReLU

class mindspore.nn.ReLU

Applies the ReLU (Rectified Linear Unit) activation function element-wise.

\[\text{ReLU}(input) = (input)^+ = \max(0, input).\]

It returns the element-wise \(\max(0, input)\).

Note

Neurons with negative outputs are suppressed, while active neurons stay the same.
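
As a plain-Python illustration of this rule (not part of the official example set), applying \(\max(0, x)\) keeps positive values and zeroes out negative ones:

>>> # Illustration only: the element-wise rule max(0, x)
>>> [max(0.0, x) for x in (-3.0, -0.5, 0.0, 1.5)]
[0.0, 0.0, 0.0, 1.5]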

ReLU Activation Function Graph:

[figure: ReLU.png]

Inputs:
  • input (Tensor) - The input tensor of any dimension.

Outputs:

Tensor, with the same dtype and shape as input.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> relu = mindspore.nn.ReLU()
>>> input_tensor = mindspore.tensor([-1.0, 2.0, -3.0, 2.0, -1.0], mindspore.float32)
>>> output = relu(input_tensor)
>>> print(output)
[0. 2. 0. 2. 0.]
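
Since input may have any dimension, the following is a minimal cross-check sketch (not from the original page; it uses NumPy's np.maximum as a reference) showing that a 2-D input is handled the same way, element-wise:

>>> import numpy as np
>>> import mindspore
>>> # Hypothetical 2-D input to illustrate that ReLU accepts tensors of any dimension
>>> x_np = np.array([[-1.0, 0.5], [3.0, -2.0]], dtype=np.float32)
>>> relu = mindspore.nn.ReLU()
>>> output = relu(mindspore.tensor(x_np))
>>> # The result should match NumPy's element-wise max(0, x)
>>> np.allclose(output.asnumpy(), np.maximum(0.0, x_np))
True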