RReLU
class torch.nn.RReLU(lower=0.125, upper=0.3333333333333333, inplace=False)
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:

$$\text{RReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ ax & \text{otherwise} \end{cases}$$

where $a$ is randomly sampled from the uniform distribution $\mathcal{U}(\text{lower}, \text{upper})$.
See: https://arxiv.org/pdf/1505.00853.pdf
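As a rough sketch of the rule above (illustrative only, not the module's actual implementation; the function name and the per-element sampling are assumptions made for this example), the training-time behaviour can be reproduced with basic tensor ops:

>>> import torch
>>> def rrelu_sketch(x, lower=1/8, upper=1/3):
...     # sample one slope per element from U(lower, upper)
...     a = torch.empty_like(x).uniform_(lower, upper)
...     # keep non-negative values, scale negative values by the sampled slope
...     return torch.where(x >= 0, x, a * x)
>>> x = torch.randn(4)
>>> y = rrelu_sketch(x)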
Parameters
- lower – lower bound of the uniform distribution. Default: 1/8
- upper – upper bound of the uniform distribution. Default: 1/3
- inplace – can optionally do the operation in-place. Default: False
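The inplace flag makes the activation overwrite its input tensor instead of allocating a new output. A minimal illustration (bounds and tensor shape chosen arbitrarily for this example):

>>> import torch
>>> import torch.nn as nn
>>> m = nn.RReLU(0.1, 0.3, inplace=True)
>>> x = torch.randn(4)
>>> y = m(x)  # x itself is updated in place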
Shape:
- Input: (*), where * means any number of dimensions.
- Output: (*), same shape as the input.
Examples:
>>> m = nn.RReLU(0.1, 0.3)
>>> input = torch.randn(2)
>>> output = m(input)
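Following the test-time rule from the paper linked above, the randomness is switched off in eval() mode and the negative slope becomes the fixed average (lower + upper) / 2, so outputs are deterministic. A quick check, continuing the session above with an arbitrary constant input:

>>> m = nn.RReLU(0.1, 0.3).eval()
>>> x = torch.full((3,), -1.0)
>>> m(x)  # fixed slope (0.1 + 0.3) / 2 = 0.2 on the negative side
tensor([-0.2000, -0.2000, -0.2000])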