mindspore.ops.softshrink

mindspore.ops.softshrink(x, lambd=0.5)

Applies the Softshrink function element-wise.

\[\text{SoftShrink}(x) = \begin{cases} x - \lambda, & \text{if } x > \lambda \\ x + \lambda, & \text{if } x < -\lambda \\ 0, & \text{otherwise} \end{cases}\]
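
For intuition, here is a minimal NumPy sketch of the same piecewise rule (an illustrative reference only, not part of the MindSpore API; the helper name softshrink_ref is hypothetical and operates on a plain ndarray rather than a Tensor):

>>> import numpy as np
>>> def softshrink_ref(x, lambd=0.5):
...     # Shrink values toward zero by lambd; zero out everything in [-lambd, lambd].
...     return np.where(x > lambd, x - lambd, np.where(x < -lambd, x + lambd, 0.0))
...
>>> softshrink_ref(np.array([-1.0, 0.2, 0.9]))
array([-0.5,  0. ,  0.4])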

SoftShrink Activation Function Graph: see Softshrink.png.

Parameters
  • x (Tensor) – The input tensor of Softshrink, with data type of float16 or float32.

  • lambd (float) – The \(\lambda\) in the formula above, which must be greater than or equal to 0. Default: 0.5.

Returns

Tensor, has the same shape and data type as x.

Raises
  • TypeError – If lambd is not a float.

  • TypeError – If x is not a Tensor.

  • TypeError – If dtype of x is neither float16 nor float32.

  • ValueError – If lambd is less than 0.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor
>>> from mindspore import ops
>>> import numpy as np
>>> x = Tensor(np.array([[ 0.5297,  0.7871,  1.1754], [ 0.7836,  0.6218, -1.1542]]), mindspore.float16)
>>> output = ops.softshrink(x)
>>> print(output)
[[ 0.02979  0.287    0.676  ]
 [ 0.2837   0.1216  -0.6543 ]]
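
As an additional illustration (not from the original example), the lambd argument can be overridden; with lambd=1.0 and the inputs below, the expected result follows directly from the piecewise formula:

>>> y = Tensor(np.array([-2.0, -0.5, 0.0, 0.5, 2.0]), mindspore.float32)
>>> out = ops.softshrink(y, lambd=1.0)
>>> print(out)
[-1.  0.  0.  0.  1.]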