tf.keras.layers.ReLU


Rectified Linear Unit activation function.

Inherits From: Layer

With default values, it returns element-wise max(x, 0).

Otherwise, it follows:
f(x) = max_value for x >= max_value,
f(x) = x for threshold <= x < max_value,
f(x) = negative_slope * (x - threshold) otherwise.
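A minimal sketch of this behavior (assuming eager execution, e.g. TF 2.x or tf.enable_eager_execution() in 1.x; the input values are illustrative):

import tensorflow as tf

# Default ReLU: element-wise max(x, 0).
relu = tf.keras.layers.ReLU()
print(relu(tf.constant([-3.0, -1.0, 0.0, 2.0])))   # [0., 0., 0., 2.]

# Parameterized ReLU: sloped below threshold, capped at max_value.
relu = tf.keras.layers.ReLU(max_value=1.0, negative_slope=0.2, threshold=0.5)
print(relu(tf.constant([-3.0, 0.4, 0.6, 2.0])))
# f(-3.0) = 0.2 * (-3.0 - 0.5) = -0.7   (below threshold)
# f(0.4)  = 0.2 * (0.4 - 0.5)  = -0.02  (below threshold)
# f(0.6)  = 0.6                         (threshold <= x < max_value)
# f(2.0)  = 1.0                         (clipped at max_value)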

Input shape:

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape:

Same shape as the input.
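As a sketch of how input_shape is passed when ReLU is the first layer in a model (the (4,) shape and the Dense layer are illustrative assumptions, not part of this page):

import tensorflow as tf

model = tf.keras.Sequential([
    # input_shape excludes the samples (batch) axis; the output shape
    # of ReLU equals its input shape.
    tf.keras.layers.ReLU(input_shape=(4,)),
    tf.keras.layers.Dense(2),
])
model.summary()   # ReLU output shape: (None, 4), same as its input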

Arguments
max_value: Float >= 0. Maximum activation value.
negative_slope: Float >= 0. Negative slope coefficient.
threshold: Float. Threshold value for thresholded activation.
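For illustration (an observation following from the formulas above, not stated on this page), particular argument settings reproduce common ReLU variants:

import tensorflow as tf

relu6_like = tf.keras.layers.ReLU(max_value=6.0)        # capped activation, similar to ReLU6
leaky_like = tf.keras.layers.ReLU(negative_slope=0.3)   # f(x) = 0.3 * x for x < 0, like LeakyReLU(alpha=0.3)

x = tf.constant([-2.0, 3.0, 10.0])
# relu6_like(x) -> [0., 3., 6.]
# leaky_like(x) -> [-0.6, 3., 10.]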
