torch.nn.functional.glu

torch.nn.functional.glu(input, dim=-1) → Tensor

Applies the gated linear unit function. Computes:

GLU(a, b) = a ⊗ σ(b)

where input is split in half along dim to form a and b, σ is the sigmoid function, and ⊗ is the element-wise product between matrices.

See Language Modeling with Gated Convolutional Networks.

Parameters
  • input (Tensor) – input tensor

  • dim (int) – dimension on which to split the input. Default: -1

Return type

Tensor
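Example

A minimal sketch showing that glu halves the chosen dimension and matches the manual computation a ⊗ σ(b); the input size along dim must be even so it can be split in half:

```python
import torch
import torch.nn.functional as F

x = torch.randn(4, 6)       # last dimension has even size 6
out = F.glu(x, dim=-1)      # split into two halves of size 3 each
print(out.shape)            # torch.Size([4, 3])

# Equivalent manual computation: split along dim, then a * sigmoid(b)
a, b = x.chunk(2, dim=-1)
assert torch.allclose(out, a * torch.sigmoid(b))
```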
