mindspore.ops.glu
- mindspore.ops.glu(x, axis=-1)[source]
Computes GLU (Gated Linear Unit activation function) of input tensors.
Warning
After version 2.9.0, the parameters x and axis will be renamed to input and dim.
\[\text{GLU}(a, b) = a \otimes \sigma(b)\]
where \(a\) is the first half of the input tensor, split along axis, and \(b\) is the second half.
Here \(\sigma\) is the sigmoid function, and \(\otimes\) is the Hadamard (element-wise) product. See Language Modeling with Gated Convolutional Networks.
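The formula above can be sketched in plain NumPy; this is an illustrative reference computation, not MindSpore's actual implementation:

```python
import numpy as np

def glu_ref(x, axis=-1):
    # Split x into two equal halves a and b along `axis`;
    # the size of that axis must be even.
    a, b = np.split(x, 2, axis=axis)
    # GLU(a, b) = a ⊗ sigmoid(b), the Hadamard product
    return a * (1.0 / (1.0 + np.exp(-b)))

x = np.array([[0.1, 0.2, 0.3, 0.4],
              [0.5, 0.6, 0.7, 0.8]])
print(glu_ref(x))
```

The output agrees with the `ops.glu` example below, since both compute the same split-and-gate operation.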
- Parameters
  - x (Tensor) – Input tensor of shape \((\ast_1, N, \ast_2)\), where \(N\) must be even.
  - axis (int, optional) – Axis along which to split the input. Default: -1.
- Returns
Tensor, with the same dtype as x and shape \((\ast_1, M, \ast_2)\), where \(M=N/2\).
- Raises
- Supported Platforms:
Ascend GPU CPU
Examples
>>> from mindspore import Tensor, ops
>>> input = Tensor([[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]])
>>> output = ops.glu(input)
>>> print(output)
[[0.05744425 0.11973753]
 [0.33409387 0.41398472]]