mindscience.models.layers.activation.SReLU

class mindscience.models.layers.activation.SReLU

Sin Rectified Linear Unit (SReLU) activation function.

Applies the sin rectified linear unit function element-wise.

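The exact formula is not stated on this page, but the output printed in the Examples section is consistent with the element-wise rule SReLU(x) = ReLU(x) * ReLU(1 - x) * sin(2*pi*x), i.e. a sine wave windowed to the interval [0, 1]. A minimal NumPy sketch of that assumed rule (the helper name srelu_reference is illustrative, not part of the API):

>>> import numpy as np
>>> def srelu_reference(x):
...     # Assumed rule: zero outside [0, 1], ReLU(x) * ReLU(1 - x) * sin(2*pi*x) inside.
...     x = np.asarray(x, dtype=np.float32)
...     return np.maximum(x, 0.0) * np.maximum(1.0 - x, 0.0) * np.sin(2.0 * np.pi * x)
...
>>> # Applying it to [[1.2, 0.1], [0.2, 3.2]] approximately reproduces the output shown in Examples.
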
Inputs:
  • input (Tensor) - The input of SReLU.

Outputs:
  • output (Tensor) - Tensor with the same type and shape as the input.

Examples

>>> import numpy as np
>>> from mindscience.models.layers.activation import SReLU
>>> from mindspore import Tensor
>>> input_x = Tensor(np.array([[1.2, 0.1], [0.2, 3.2]], dtype=np.float32))
>>> srelu = SReLU()
>>> output = srelu(input_x)
>>> print(output)
[[0.         0.05290067]
 [0.15216905 0.        ]]