LeakyReLU#

class torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)[source]#

Applies the LeakyReLU function element-wise.

LeakyReLU(x) = max(0, x) + negative_slope × min(0, x)

or

LeakyReLU(x) = { x,                  if x ≥ 0
               { negative_slope × x, otherwise

Parameters
  • negative_slope (float) – Controls the angle of the negative slope (which is used for negative input values). Default: 1e-2

  • inplace (bool) – If True, performs the operation in place. Default: False
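
The two formulations above are equivalent. As a quick illustration (a plain-Python scalar sketch, not the actual PyTorch kernel), both forms can be checked against each other:

```python
def leaky_relu(x, negative_slope=0.01):
    # Piecewise definition: x if x >= 0, negative_slope * x otherwise.
    return x if x >= 0 else negative_slope * x

def leaky_relu_closed_form(x, negative_slope=0.01):
    # Closed form: max(0, x) + negative_slope * min(0, x).
    return max(0.0, x) + negative_slope * min(0.0, x)

# The two forms agree for any scalar input.
for v in [-3.0, -0.01, 0.0, 0.5, 2.0]:
    assert leaky_relu(v) == leaky_relu_closed_form(v)
```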

Shape:
  • Input: (∗), where ∗ means any number of dimensions.

  • Output: (∗), same shape as the input.

[Figure: plot of the LeakyReLU activation function]

Examples:

>>> m = nn.LeakyReLU(0.1)
>>> input = torch.randn(2)
>>> output = m(input)
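
Building on the example above, the module's output can be cross-checked against the closed-form expression using `torch.clamp` (the tensor values here are illustrative):

```python
import torch
import torch.nn as nn

m = nn.LeakyReLU(0.1)
x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
y = m(x)

# Closed form: max(0, x) + negative_slope * min(0, x),
# expressed with torch.clamp for the max/min-against-zero terms.
expected = torch.clamp(x, min=0) + 0.1 * torch.clamp(x, max=0)
assert torch.equal(y, expected)
```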
extra_repr()[source]#

Return the extra representation of the module.

Return type

str

forward(input)[source]#

Run forward pass.

Return type

Tensor
