
PReLU

class torch.nn.PReLU(num_parameters=1, init=0.25, device=None, dtype=None)[source]

Applies the element-wise PReLU function.

\text{PReLU}(x) = \max(0, x) + a \cdot \min(0, x)

or

\text{PReLU}(x) =
\begin{cases}
    x,  & \text{if } x \geq 0 \\
    ax, & \text{otherwise}
\end{cases}

Here a is a learnable parameter. When called without arguments, nn.PReLU() uses a single parameter a across all input channels. If called with nn.PReLU(nChannels), a separate a is used for each input channel.
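
For example, with the default a = 0.25, PReLU(-2) = max(0, -2) + 0.25 * min(0, -2) = -0.5, while PReLU(3) = 3.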

Note

For good performance, weight decay should not be used when learning a.
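
For instance, one way to follow this advice is to give the PReLU parameters their own optimizer parameter group with weight decay disabled. A minimal sketch (the model and hyperparameter values are illustrative only):

>>> import torch
>>> from torch import nn
>>> model = nn.Sequential(nn.Linear(8, 8), nn.PReLU())
>>> prelu_params = [p for mod in model.modules()
...                 if isinstance(mod, nn.PReLU) for p in mod.parameters()]
>>> other_params = [p for p in model.parameters()
...                 if all(p is not q for q in prelu_params)]
>>> optimizer = torch.optim.SGD(
...     [{"params": other_params, "weight_decay": 1e-4},
...      {"params": prelu_params, "weight_decay": 0.0}],
...     lr=0.1)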

Note

The channel dim is the 2nd dim of the input. When the input has fewer than 2 dims, there is no channel dim and the number of channels is 1.
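
A short sketch of this rule (input shapes chosen only for illustration):

>>> m = nn.PReLU(num_parameters=5)
>>> m(torch.randn(2, 5, 10)).shape    # dim 1 is the channel dim, so 5 channels
torch.Size([2, 5, 10])
>>> nn.PReLU()(torch.randn(4)).shape  # fewer than 2 dims: a single shared channel
torch.Size([4])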

Parameters
  • num_parameters (int) – number of a to learn. Although it takes an int as input, only two values are legitimate: 1, or the number of channels of the input. Default: 1

  • init (float) – the initial value of a. Default: 0.25

Shape:
  • Input: (*), where * means any number of additional dimensions.

  • Output: (*), same shape as the input.

Variables

weight (Tensor) – the learnable weights of shape (num_parameters).
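
For example (num_parameters and init chosen for illustration), weight is a 1-D tensor with num_parameters entries, each initialized to init:

>>> m = nn.PReLU(num_parameters=3, init=0.1)
>>> m.weight
Parameter containing:
tensor([0.1000, 0.1000, 0.1000], requires_grad=True)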

[Figure: plot of the PReLU activation function.]

Examples:

>>> m = nn.PReLU()
>>> input = torch.randn(2)
>>> output = m(input)
extra_repr()[source]

Return the extra representation of the module.

Return type

str

forward(input)[source]

Runs the forward pass.

Return type

Tensor

reset_parameters()[source]

Resets parameters based on their initialization used in __init__.
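
A minimal sketch, assuming (per the description above) that the weight is refilled with the init value passed to __init__:

>>> m = nn.PReLU()                    # init defaults to 0.25
>>> with torch.no_grad():
...     _ = m.weight.fill_(0.5)       # simulate a trained value
>>> m.reset_parameters()
>>> float(m.weight)
0.25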
