tf.keras.activations.softplus
Softplus activation function, softplus(x) = log(exp(x) + 1).
tf.keras.activations.softplus(x)
Example Usage:
a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype = tf.float32)
b = tf.keras.activations.softplus(a)
b.numpy()
array([2.0611537e-09, 3.1326166e-01, 6.9314718e-01, 1.3132616e+00,
2.0000000e+01], dtype=float32)
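For comparison, the same values can be computed directly from the formula. A minimal sketch (the naive expression loses precision for very negative inputs, where exp(x) underflows against the + 1 term in float32, so the built-in activation is the safer choice):

import tensorflow as tf

a = tf.constant([-1.0, 0.0, 1.0], dtype=tf.float32)
naive = tf.math.log(tf.math.exp(a) + 1.0)   # direct evaluation of the formula
builtin = tf.keras.activations.softplus(a)  # numerically stable implementation
print(naive.numpy())    # [0.31326166 0.6931472  1.3132616 ]
print(builtin.numpy())  # matches the naive formula for moderate inputs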
| Arguments | |
|---|---|
| `x` | Input tensor. |
| Returns |
|---|
| The softplus activation: `log(exp(x) + 1)`. |
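Like other built-in activations, softplus can also be attached to a layer, either by its string name or by passing the function object. A brief sketch (layer sizes here are arbitrary):

import tensorflow as tf

by_name = tf.keras.layers.Dense(4, activation="softplus")
by_function = tf.keras.layers.Dense(4, activation=tf.keras.activations.softplus)
out = by_name(tf.ones((1, 3)))  # every output is positive, since softplus(x) > 0 for all x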
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/keras/activations/softplus