SELU
- class torch.nn.modules.activation.SELU(inplace=False)
Applies the SELU function element-wise.
$$\text{SELU}(x) = \text{scale} * (\max(0, x) + \min(0, \alpha * (\exp(x) - 1)))$$
with $\alpha = 1.6732632423543772848170429916717$ and $\text{scale} = 1.0507009873554804934193349852946$.
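As a quick numerical sanity check (illustrative only, not part of the reference; the tensor size is arbitrary), the formula above can be reproduced with basic tensor ops and compared against torch.nn.functional.selu; the comparison should print True up to floating-point tolerance:
>>> import torch
>>> import torch.nn.functional as F
>>> alpha = 1.6732632423543772848170429916717
>>> scale = 1.0507009873554804934193349852946
>>> x = torch.randn(5)
>>> # max(0, x) and min(0, alpha * (exp(x) - 1)) expressed via clamp
>>> manual = scale * (torch.clamp(x, min=0) + torch.clamp(alpha * (x.exp() - 1), max=0))
>>> torch.allclose(manual, F.selu(x))
True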
Warning
When using kaiming_normal or kaiming_normal_ for initialization, nonlinearity='linear' should be used instead of nonlinearity='selu' in order to get Self-Normalizing Neural Networks. See torch.nn.init.calculate_gain() for more information. An illustrative sketch follows.
More details can be found in the paper Self-Normalizing Neural Networks.
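A minimal sketch of the initialization described in the warning above (layer sizes are arbitrary and not taken from the reference): weights feeding a SELU activation are initialized with nonlinearity='linear', whose gain is 1.
>>> import torch.nn as nn
>>> layer = nn.Linear(128, 128)
>>> # use the 'linear' gain rather than 'selu' for self-normalizing networks
>>> _ = nn.init.kaiming_normal_(layer.weight, nonlinearity='linear')
>>> _ = nn.init.zeros_(layer.bias)
>>> model = nn.Sequential(layer, nn.SELU())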
- Parameters
inplace (bool, optional) – can optionally do the operation in-place. Default: False
- Shape:
Input: (∗), where ∗ means any number of dimensions.
Output: (∗), same shape as the input.
Examples:
>>> m = nn.SELU()
>>> input = torch.randn(2)
>>> output = m(input)
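A variant using inplace=True (a sketch; the input tensor itself is overwritten, so use it only when the original values are no longer needed):
>>> m = nn.SELU(inplace=True)
>>> input = torch.randn(2)
>>> output = m(input)  # input is modified in place and returned as output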