GLU#
- class torch.nn.modules.activation.GLU(dim=-1)[source]#
Applies the gated linear unit function.
GLU(a,b)=a⊗σ(b), where a is the first half of the input matrix, b is the second half, and σ is the sigmoid function.
- Parameters
dim (int) – the dimension on which to split the input. Default: -1
- Shape:
Input: (∗1,N,∗2) where ∗ means any number of additional dimensions
Output: (∗1,M,∗2) where M=N/2
Examples:
>>> m = nn.GLU()
>>> input = torch.randn(4, 2)
>>> output = m(input)
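A short sketch (assuming `torch` is installed) confirming that `nn.GLU` splits the chosen dimension in half and computes a⊗σ(b), and that the output dimension is halved accordingly:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# GLU with dim=-1 splits the last dimension: a is the first half, b the second.
m = nn.GLU(dim=-1)
x = torch.randn(4, 6)   # N = 6 on the split dimension (must be even)
out = m(x)              # shape (4, 3): M = N / 2

# Manual equivalent of the formula GLU(a, b) = a * sigmoid(b)
a, b = x.chunk(2, dim=-1)
manual = a * torch.sigmoid(b)

print(out.shape)                          # torch.Size([4, 3])
print(torch.allclose(out, manual))        # True
```

Note that the size of the input along `dim` must be even; otherwise the split into two equal halves fails with a runtime error.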