LeakyReLU
- class torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)
Applies the LeakyReLU function element-wise.
$$
\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} \cdot \min(0, x)
$$

or equivalently

$$
\text{LeakyReLU}(x) =
\begin{cases}
x, & \text{if } x \geq 0 \\
\text{negative\_slope} \times x, & \text{otherwise}
\end{cases}
$$

- Parameters
  - negative_slope (float) – Controls the angle of the negative slope, i.e. the factor applied to negative input values. Default: 0.01
  - inplace (bool) – can optionally do the operation in-place. Default: False
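The two expressions above are equivalent. As an illustrative check of the piecewise definition (the explicit torch.where computation below is only a hand-written restatement of the formula, not the module's implementation):

>>> import torch
>>> from torch import nn
>>> x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
>>> m = nn.LeakyReLU(negative_slope=0.1)
>>> manual = torch.where(x >= 0, x, 0.1 * x)  # piecewise definition applied by hand
>>> torch.allclose(m(x), manual)
True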
- Shape:
Input: (∗), where ∗ means any number of dimensions
Output: (∗), same shape as the input
Examples:
>>> m = nn.LeakyReLU(0.1)
>>> input = torch.randn(2)
>>> output = m(input)
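The inplace flag from the constructor overwrites the input tensor instead of allocating a new one; a minimal sketch of that usage (variable names here are illustrative):

>>> m_inplace = nn.LeakyReLU(0.1, inplace=True)
>>> data = torch.randn(2)
>>> result = m_inplace(data)  # data is modified in place; result is the same tensor object
>>> result is data
True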