mindspore.mint.nn.Softshrink

class mindspore.mint.nn.Softshrink(lambd=0.5)[source]

Apply the Softshrink function element-wise.

\[\begin{split}\text{Softshrink}(x) = \begin{cases} x - \lambda, & \text{ if } x > \lambda \\ x + \lambda, & \text{ if } x < -\lambda \\ 0, & \text{ otherwise } \end{cases}\end{split}\]
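The piecewise definition above can be sketched directly with NumPy. This is an illustrative reference implementation of the formula, not the MindSpore operator itself; the function name `softshrink` is chosen here for clarity.

```python
import numpy as np

def softshrink(x, lambd=0.5):
    """Element-wise Softshrink following the piecewise formula above.

    Illustrative sketch only (assumed helper, not the MindSpore kernel):
      x - lambd  where x >  lambd
      x + lambd  where x < -lambd
      0          otherwise
    """
    return np.where(x > lambd, x - lambd,
                    np.where(x < -lambd, x + lambd, 0.0))
```

Values inside the band ``[-lambd, lambd]`` are shrunk to zero, while values outside it are pulled toward zero by ``lambd``.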

Softshrink Activation Function Graph:

Parameters

lambd (number, optional) – The threshold \(\lambda\) in the Softshrink formula; must be no less than zero. Default: 0.5.

Inputs:
  • input (Tensor) - The input tensor. Supported dtypes:

    • Ascend: mindspore.float16, mindspore.float32, mindspore.bfloat16.

Outputs:

Tensor, with the same shape and dtype as input.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> input = mindspore.tensor([[ 0.5297,  0.7871,  1.1754], [ 0.7836,  0.6218, -1.1542]], mindspore.float16)
>>> softshrink = mindspore.mint.nn.Softshrink()
>>> output = softshrink(input)
>>> print(output)
[[ 0.02979  0.287    0.676  ]
 [ 0.2837   0.1216  -0.6543 ]]