ELU
- class torch.nn.modules.activation.ELU(alpha=1.0, inplace=False)
Applies the Exponential Linear Unit (ELU) function, element-wise.
Method described in the paper: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs).
ELU is defined as:
ELU(x) = x if x > 0, α ∗ (exp(x) − 1) if x ≤ 0
- Parameters
  - alpha (float) – the α value for the ELU formulation. Default: 1.0
  - inplace (bool) – can optionally do the operation in-place. Default: False
- Shape:
Input: (∗), where ∗ means any number of dimensions.
Output: (∗), same shape as the input.
Examples:
>>> m = nn.ELU()
>>> input = torch.randn(2)
>>> output = m(input)
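A minimal runnable sketch that checks `nn.ELU` against the definition above, computing the same formula manually with `torch.where` (the tensor values here are arbitrary illustrative inputs):

```python
import torch
import torch.nn as nn

alpha = 1.0
m = nn.ELU(alpha=alpha)

# Mix of negative, zero, and positive inputs.
x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])
y = m(x)

# Manual version of the formula: x if x > 0, else alpha * (exp(x) - 1).
expected = torch.where(x > 0, x, alpha * (torch.exp(x) - 1))
assert torch.allclose(y, expected)

# Positive inputs pass through unchanged; negative outputs saturate above -alpha.
assert y[3].item() == 1.0
assert (y > -alpha).all()
```

Note that the negative branch is bounded below by −α, which is what gives ELU its saturation for large negative inputs.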