mindspore.ops.mish

mindspore.ops.mish(x)

Computes MISH (A Self Regularized Non-Monotonic Neural Activation Function) of the input Tensor element-wise.

The function is defined as:

\[\text{output} = x \times \tanh(\log(1 + \exp(x)))\]

For more details, see A Self Regularized Non-Monotonic Neural Activation Function.
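As a quick cross-check of the formula above, it can be evaluated directly with NumPy. The snippet below is a minimal reference sketch, not the MindSpore kernel implementation; log1p is used to evaluate the log(1 + exp(x)) term:

>>> import numpy as np
>>> def mish_reference(x):
...     # x * tanh(log(1 + exp(x))), i.e. x * tanh(softplus(x))
...     return x * np.tanh(np.log1p(np.exp(x)))
...
>>> round(float(mish_reference(np.float32(-1.0))), 4)
-0.3034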

Mish Activation Function Graph:

[figure: Mish.png]
Parameters

x (Tensor) – The input Tensor. Supported dtypes:

  • GPU/CPU: float16, float32, float64.

  • Ascend: float16, float32.

Returns

Tensor, with the same dtype and shape as x.

Raises

TypeError – If the dtype of x is not float16, float32, or float64.
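The following sketch illustrates the Returns and Raises behavior above: the output keeps the shape and dtype of x, and a non-floating-point input raises TypeError as documented (the exact error message may differ across MindSpore versions and backends):

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.ones((2, 3)), mindspore.float32)
>>> out = ops.mish(x)
>>> out.shape == x.shape and out.dtype == x.dtype
True
>>> try:
...     ops.mish(Tensor(np.array([1, 2, 3]), mindspore.int32))
... except TypeError:
...     print("unsupported dtype rejected")
...
unsupported dtype rejected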

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[-1.0, 4.0, -8.0], [2.0, -5.0, 9.0]]), mindspore.float32)
>>> output = ops.mish(input_x)
>>> print(output)
[[-3.0340147e-01  3.9974129e+00 -2.68311895e-03]
 [ 1.9439590e+00 -3.3576239e-02  8.99999990e+00]]
>>> input_x = Tensor(2.1, mindspore.float32)
>>> output = ops.mish(input_x)
>>> print(output)
2.050599
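Continuing the session above, the scalar result can also be cross-checked against the defining formula using elementary mindspore.ops primitives. This is only a sketch of the identity mish(x) = x * tanh(log(1 + exp(x))); small float32 discrepancies are expected, hence the tolerance:

>>> x = Tensor(2.1, mindspore.float32)
>>> manual = x * ops.tanh(ops.log(1.0 + ops.exp(x)))
>>> print(np.isclose(ops.mish(x).asnumpy(), manual.asnumpy(), atol=1e-5))
True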