GELU

class torch.nn.GELU

Applies the Gaussian Error Linear Units function:

\text{GELU}(x) = x * \Phi(x)

where \Phi(x) is the Cumulative Distribution Function of the Gaussian distribution.
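For concreteness, \Phi(x) can be written in closed form with the error function as 0.5 * (1 + \text{erf}(x / \sqrt{2})). The snippet below is a minimal sketch (not part of the original docs) of how this identity relates to the exact, erf-based GELU computed by torch.nn.functional.gelu:

>>> import math
>>> import torch
>>> import torch.nn.functional as F
>>> x = torch.randn(5)
>>> phi = 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))  # Gaussian CDF Phi(x)
>>> manual = x * phi                                    # x * Phi(x)
>>> close = torch.allclose(manual, F.gelu(x), atol=1e-6)  # compare with built-in GELU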

Shape:
  • Input: (N, *), where * means any number of additional dimensions
  • Output: (N, *), same shape as the input
[Plot of the GELU activation function]

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.GELU()
>>> input = torch.randn(2)
>>> output = m(input)
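GELU is commonly used as a drop-in replacement for ReLU inside a model. A sketch of such usage (the layer sizes here are arbitrary, chosen only for illustration):

>>> mlp = nn.Sequential(
...     nn.Linear(16, 32),  # hypothetical feature sizes
...     nn.GELU(),
...     nn.Linear(32, 8),
... )
>>> output = mlp(torch.randn(4, 16))  # output has shape (4, 8)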
