mindspore.mint.nn.ReLU6
- class mindspore.mint.nn.ReLU6(inplace=False)
Apply ReLU6 (rectified linear unit capped at 6) element-wise.
Refer to mindspore.mint.nn.functional.relu6() for more details.
ReLU6 Activation Function Graph:
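ReLU6 clamps each element of the input to the range [0, 6], i.e. ReLU6(x) = min(max(0, x), 6). As an illustrative reference only (not part of the MindSpore API), the same element-wise computation can be reproduced with NumPy:

>>> import numpy as np
>>> # Reference computation assumed equivalent to ReLU6: clamp values to [0, 6].
>>> x = np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
>>> np.minimum(np.maximum(x, 0.0), 6.0)
array([[0., 4., 0.],
       [2., 0., 6.]], dtype=float32)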
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> relu6 = mindspore.mint.nn.ReLU6(inplace=True)
>>> input_tensor = mindspore.tensor([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], mindspore.float32)
>>> output = relu6(input_tensor)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 6.]]