mindspore.ops.ReLUV2

class mindspore.ops.ReLUV2

The ReLUV2 interface is deprecated. Please use mindspore.ops.ReLU instead.

Rectified Linear Unit activation function.

It computes \(\max(0, x)\) element-wise; that is, neurons with negative outputs are suppressed (set to zero) and active neurons are left unchanged.

\[\text{ReLU}(x) = (x)^+ = \max(0, x)\]
Inputs:
  • input_x (Tensor) - The input tensor. Must be 4-D.

Outputs:
  • output (Tensor) - Has the same data type and shape as input_x.

  • mask (Tensor) - A tensor reserved for internal use; its contents are not meaningful to the caller.

Raises:
  • TypeError - If input_x is not a Tensor.
  • ValueError - If the shape of input_x is not 4-D.

Supported Platforms:

Deprecated

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[[[1, -2], [-3, 4]], [[-5, 6], [7, -8]]]]), mindspore.float32)
>>> relu_v2 = ops.ReLUV2()
>>> output, _ = relu_v2(input_x)
>>> print(output)
[[[[1. 0.]
   [0. 4.]]
  [[0. 6.]
   [7. 0.]]]]
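
Since ReLUV2 is deprecated, the same result can be obtained with mindspore.ops.ReLU, which returns a single tensor and produces no mask. A minimal migration sketch, reusing the same input as above (ReLU itself does not require a 4-D input):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[[[1, -2], [-3, 4]], [[-5, 6], [7, -8]]]]), mindspore.float32)
>>> relu = ops.ReLU()
>>> output = relu(input_x)
>>> print(output)
[[[[1. 0.]
   [0. 4.]]
  [[0. 6.]
   [7. 0.]]]]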