mindspore.mint.nn.ReLU6

class mindspore.mint.nn.ReLU6(inplace=False)

Applies ReLU6 (rectified linear unit capped at 6) element-wise.

Refer to mindspore.mint.nn.functional.relu6() for more details.
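
Concretely, ReLU6 clamps each element of the input to the range [0, 6]:

$$\text{ReLU6}(x) = \min(\max(0, x),\, 6)$$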

ReLU6 Activation Function Graph:

[Figure: ReLU6.png — graph of the ReLU6 activation: 0 for x < 0, linear on [0, 6], capped at 6 for x > 6]
Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> # inplace=True modifies input_tensor in place rather than allocating a new tensor
>>> relu6 = mindspore.mint.nn.ReLU6(inplace=True)
>>> input_tensor = mindspore.tensor([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], mindspore.float32)
>>> output = relu6(input_tensor)
>>> print(output)
[[0. 4. 0.]
 [2. 0. 6.]]
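
For comparison, the functional form referenced above can be called directly without constructing a layer. A minimal sketch, assuming mindspore.mint.nn.functional.relu6 takes the input tensor as its first argument and is out of place by default:

>>> import mindspore
>>> from mindspore.mint.nn import functional as F
>>> x = mindspore.tensor([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]], mindspore.float32)
>>> print(F.relu6(x))  # x itself is left unchanged
[[0. 4. 0.]
 [2. 0. 6.]]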