mindspore.mint.nn.Mish
- class mindspore.mint.nn.Mish
Computes the MISH (A Self Regularized Non-Monotonic Neural Activation Function) activation function element-wise.
Refer to mindspore.mint.nn.functional.mish() for more details.

Mish Activation Function Graph:
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> x = mindspore.tensor([[-1.1, 4.0, -8.0], [2.0, -5.0, 9.0]], mindspore.float32)
>>> mish = mindspore.mint.nn.Mish()
>>> output = mish(x)
>>> print(output)
[[-3.0764845e-01  3.9974124e+00 -2.6832507e-03]
 [ 1.9439589e+00 -3.3576239e-02  8.9999990e+00]]
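As a rough cross-check of the values above, the NumPy sketch below computes Mish directly from its definition, x * tanh(softplus(x)). It is illustrative only and not part of the MindSpore API.

import numpy as np

# Illustrative reference implementation (assumption: Mish(x) = x * tanh(softplus(x)),
# as defined in the Mish paper); not part of the MindSpore API.
x_np = np.array([[-1.1, 4.0, -8.0], [2.0, -5.0, 9.0]], dtype=np.float32)
softplus = np.log1p(np.exp(x_np))     # softplus(x) = ln(1 + exp(x))
reference = x_np * np.tanh(softplus)  # element-wise Mish
print(reference)                      # should agree with the MindSpore output above up to rounding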