SiLU
class torch.nn.SiLU(inplace=False)
Applies the Sigmoid Linear Unit (SiLU) function, element-wise. The SiLU function is also known as the swish function:

silu(x) = x * σ(x), where σ(x) is the logistic sigmoid.
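As a quick sanity check (an illustrative snippet, not part of the original page), the module's output can be compared against this definition directly:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.SiLU()
>>> x = torch.randn(4)
>>> torch.allclose(m(x), x * torch.sigmoid(x))
True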
Note
See Gaussian Error Linear Units (GELUs) where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function where the SiLU was experimented with later.
Shape:
- Input: (N, *), where * means any number of additional dimensions.
- Output: (N, *), same shape as the input.
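For instance (an illustrative snippet, not from the original page), a batched 4-dimensional input passes through with its shape unchanged:

>>> m = nn.SiLU()
>>> x = torch.randn(8, 3, 32, 32)
>>> m(x).shape
torch.Size([8, 3, 32, 32])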
Examples:
>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
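SiLU is also available through the functional interface, torch.nn.functional.silu, and it drops into a model like any other activation module. A minimal sketch (the layer sizes here are arbitrary illustration, not from the original page):

>>> import torch.nn.functional as F
>>> torch.allclose(F.silu(input), output)  # same result as the module form
True
>>> model = nn.Sequential(nn.Linear(10, 20), nn.SiLU(), nn.Linear(20, 1))
>>> model(torch.randn(5, 10)).shape
torch.Size([5, 1])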