mindspore.ops.glu

mindspore.ops.glu(x, axis=-1)

Computes the GLU (Gated Linear Unit) activation function of the input tensor.

\[\text{GLU}(a, b) = a \otimes \sigma(b)\]

where \(a\) is the first half of the input tensor along the split axis and \(b\) is the second half.

Here \(\sigma\) is the sigmoid function, and \(\otimes\) is the Hadamard (element-wise) product. See Language Modeling with Gated Convolutional Networks.
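
For intuition, the split-and-gate computation can be reproduced outside of MindSpore. The short NumPy sketch below (glu_reference is a hypothetical helper, not part of any API) follows the definition directly: split the input in half along the given axis and multiply the first half by the sigmoid of the second.

>>> import numpy as np
>>> def glu_reference(x, axis=-1):
...     # Split x into two equal halves along `axis`, then gate the
...     # first half with the sigmoid of the second half.
...     a, b = np.split(x, 2, axis=axis)
...     return a * (1.0 / (1.0 + np.exp(-b)))
>>> x = np.array([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]])
>>> print(glu_reference(x).round(4))
[[0.0574 0.1197]
 [0.3341 0.414 ]]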

Parameters
  • x (Tensor) – Tensor to be split. Its dtype is Number, and its shape is \((\ast_1, N, \ast_2)\), where \(\ast\) means any number of additional dimensions and \(N\) must be even, since the tensor is split in half along axis.

  • axis (int, optional) – The axis along which to split the input. Default: -1, the last axis of x.

Returns

Tensor, with the same dtype as x and shape \((\ast_1, M, \ast_2)\), where \(M = N / 2\).

Supported Platforms:

Ascend GPU CPU

Examples

>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.1,0.2,0.3,0.4],[0.5,0.6,0.7,0.8]])
>>> output = ops.glu(input)
>>> print(output)
[[0.05744425 0.11973753]
 [0.33409387 0.41398472]]
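
The axis argument only changes where the input is halved. Splitting the same \(2 \times 4\) input along axis 0 instead of the last axis gates the first row with the sigmoid of the second, so the output shape becomes \((1, 4)\):

>>> output = ops.glu(input, axis=0)
>>> print(output.shape)
(1, 4)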