mindspore.ops.GeLU

class mindspore.ops.GeLU

Gaussian Error Linear Units activation function.

GeLU is described in the paper Gaussian Error Linear Units (GELUs). See also BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.

GeLU is defined as follows:

\[\text{GELU}(x_i) = x_i \cdot P(X < x_i)\]

where \(P\) is the cumulative distribution function of the standard Gaussian distribution and \(x_i\) is an element of the input.
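
For reference, the formula above can be evaluated directly in plain Python. This is an illustration only, not part of the MindSpore API; it computes \(P(X < x_i)\) with math.erf, so the values may differ slightly from the operator's float16/float32 output shown in the Examples below.

>>> import math
>>> def gelu_reference(x):
...     # x * P(X < x): P is the standard normal CDF, written via erf
...     return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
...
>>> [round(gelu_reference(v), 4) for v in (1.0, 2.0, 3.0)]
[0.8413, 1.9545, 2.996]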

Inputs:
  • x (Tensor) - The input of the activation function GeLU. The data type must be float16 or float32.

Outputs:

Tensor, with the same type and shape as x.

Raises
  • TypeError – If x is not a Tensor.

  • TypeError – If dtype of x is neither float16 nor float32.
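
To illustrate the dtype check above, a minimal sketch (assuming eager PyNative execution, where the documented TypeError is raised when the operator is called):

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> gelu = ops.GeLU()
>>> try:
...     gelu(Tensor(np.array([1.0]), mindspore.float64))
... except TypeError:
...     print("float64 input rejected")
float64 input rejected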

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([1.0, 2.0, 3.0]), mindspore.float32)
>>> gelu = ops.GeLU()
>>> result = gelu(x)
>>> print(result)
[0.841192  1.9545976  2.9963627]