
SiLU

class torch.nn.modules.activation.SiLU(inplace=False)

Applies the Sigmoid Linear Unit (SiLU) function, element-wise.

The SiLU function is also known as the swish function.

silu(x) = x * σ(x), where σ(x) is the logistic sigmoid.

Note

See Gaussian Error Linear Units (GELUs) where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function where the SiLU was experimented with later.
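
As a quick check of the definition above, the following sketch compares a manual x * σ(x) computation against the built-in functional form, torch.nn.functional.silu (the input values are arbitrary):

>>> import torch
>>> import torch.nn.functional as F
>>> x = torch.randn(4)
>>> manual = x * torch.sigmoid(x)  # silu(x) = x * sigmoid(x)
>>> torch.allclose(manual, F.silu(x))
True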

Shape:
  • Input: (*), where * means any number of dimensions.

  • Output: (*), same shape as the input.

[Plot of the SiLU activation function]

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
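
The module also accepts inplace=True, which writes the result back into the input tensor; this saves memory but can conflict with autograd when the input is needed for the backward pass. A minimal sketch:

>>> m = nn.SiLU(inplace=True)
>>> x = torch.randn(2)
>>> y = m(x)  # x is overwritten with silu(x)
>>> y is x
True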
extra_repr()

Return the extra representation of the module.

Return type: str
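
For illustration, the extra representation reflects the inplace flag, and repr(module) places this string inside the parentheses (exact output may vary across PyTorch versions):

>>> nn.SiLU(inplace=True).extra_repr()
'inplace=True'
>>> nn.SiLU(inplace=True)
SiLU(inplace=True)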

forward(input)

Runs the forward pass.

Return type: Tensor
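
Note that calling the module instance routes through nn.Module.__call__, which dispatches to forward (plus any registered hooks), so in ordinary use the module is called rather than forward directly. A quick equivalence check:

>>> m = nn.SiLU()
>>> t = torch.randn(3)
>>> torch.equal(m(t), m.forward(t))
True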
