
GLU

class torch.nn.modules.activation.GLU(dim=-1)

Applies the gated linear unit function.

GLU(a, b) = a ⊗ σ(b), where a is the first half of the input matrices, b is the second half, and ⊗ is the element-wise product.

Parameters

dim (int) – the dimension on which to split the input. Default: -1

Shape:
  • Input: (*1, N, *2) where * means any number of additional dimensions

  • Output: (*1, M, *2) where M = N/2
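The shape rule above can be checked directly: GLU halves the size of the chosen dimension, which must therefore have an even length. A minimal sketch splitting a middle dimension:

```python
import torch
import torch.nn as nn

# The input is split in two halves along `dim`, so that dimension
# must have an even length; the output has half its size.
glu = nn.GLU(dim=1)
x = torch.randn(4, 6, 5)
y = glu(x)
print(y.shape)  # torch.Size([4, 3, 5])
```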

(Figure: plot of the GLU activation function.)

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.GLU()
>>> input = torch.randn(4, 2)
>>> output = m(input)
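The output of the example above can be reproduced by hand from the definition: chunk the input in two along the split dimension and multiply the first half by the sigmoid of the second. A minimal sketch:

```python
import torch
import torch.nn as nn

m = nn.GLU()  # splits along the last dimension by default
x = torch.randn(4, 2)
out = m(x)

# Recompute GLU(a, b) = a * sigmoid(b) manually and compare.
a, b = x.chunk(2, dim=-1)
expected = a * torch.sigmoid(b)
print(torch.allclose(out, expected))  # True
```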
extra_repr()

Return the extra representation of the module.

Return type

str

forward(input)

Runs the forward pass.

Return type

Tensor
