ReLU6
- class torch.ao.nn.quantized.ReLU6(inplace=False)
Applies the element-wise function:
ReLU6(x) = min(max(x0, x), q(6)), where x0 is the zero_point and q(6) is the quantized representation of the number 6.
- Parameters
inplace (bool) – can optionally do the operation in-place. Default: False
- Shape:
Input: (N, *), where * means any number of additional dimensions
Output: (N, *), same shape as the input
Examples:
>>> m = nn.quantized.ReLU6()
>>> input = torch.randn(2)
>>> input = torch.quantize_per_tensor(input, 1.0, 0, dtype=torch.qint32)
>>> output = m(input)
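The clamping described by the formula above can be sketched in plain Python, assuming affine quantization where q(6) = round(6 / scale) + zero_point. The function name `quantized_relu6` and its signature are illustrative, not part of the torch API:

```python
def quantized_relu6(q_x, scale, zero_point):
    """Clamp quantized integer values to [zero_point, q(6)].

    Illustrative sketch of ReLU6(x) = min(max(x0, x), q(6));
    not the actual torch kernel.
    """
    # q(6): integer representation of the real number 6
    # under the given scale and zero_point
    q_six = round(6 / scale) + zero_point
    # Clamp each quantized value: the lower bound is the
    # zero_point (i.e. real 0), the upper bound is q(6)
    return [min(max(zero_point, q), q_six) for q in q_x]


# With scale=1.0 and zero_point=0, q(6) == 6, so values are
# clamped to the integer range [0, 6]:
print(quantized_relu6([-2, 3, 10], 1.0, 0))  # → [0, 3, 6]
```

With a nonzero zero_point (say scale=0.5, zero_point=1), the lower bound shifts to 1 and q(6) becomes 13, matching the same real-valued range [0, 6].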