mindspore.ops.ReLUV2

class mindspore.ops.ReLUV2

Rectified Linear Unit activation function.

It returns element-wise \(\max(0, x)\); that is, negative elements are suppressed (set to zero) and positive elements pass through unchanged.

\[\text{ReLU}(x) = (x)^+ = \max(0, x)\]

Note

The difference from ReLU is that this operator outputs an additional mask tensor, and its kernel implementation differs from that of ReLU.
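The packed layout of mask is produced by the Ascend kernel and is not specified here. As a rough illustration of the semantics only, the following NumPy sketch computes the element-wise output together with a simplified per-element activity mask (the real operator packs mask bits into uint8 values in a kernel-specific way):

import numpy as np

def relu_v2_sketch(x):
    # Illustrative only: the real ReLUV2 kernel packs the mask into
    # a uint8 tensor with an Ascend-specific bit layout.
    output = np.maximum(0, x)         # suppress negative neurons
    mask = (x > 0).astype(np.uint8)   # 1 where the neuron stays active
    return output, mask

x = np.array([[[[1, -2], [-3, 4]], [[-5, 6], [7, -8]]]], dtype=np.float32)
out, act = relu_v2_sketch(x)
print(out)   # matches the output of ReLUV2
print(act)   # simplified mask, not the packed uint8 format shown in Examples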

Inputs:
  • input_x (Tensor) - The input, which must be a 4-D tensor.

Outputs:
  • output (Tensor) - Has the same type and shape as input_x.

  • mask (Tensor) - A tensor with data type uint8.

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[[[1, -2], [-3, 4]], [[-5, 6], [7, -8]]]]), mindspore.float32)
>>> relu_v2 = ops.ReLUV2()
>>> output, mask = relu_v2(input_x)
>>> print(output)
[[[[1. 0.]
   [0. 4.]]
  [[0. 6.]
   [7. 0.]]]]
>>> print(mask)
[[[[[1 0]
    [2 0]]
   [[2 0]
    [1 0]]]]]
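
As a quick consistency check (this assumes an Ascend device, since ReLUV2 is only supported there), the first output agrees with the plain mindspore.ops.ReLU operator:

>>> relu = ops.ReLU()
>>> print(np.allclose(output.asnumpy(), relu(input_x).asnumpy()))
True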