Module: tf.contrib.opt

A module containing optimization routines.

Classes

class AGNCustomGetter: Custom variable getter used together with AGNOptimizer.

class AGNOptimizer: Wrapper that implements the Accumulated Gradient Normalization algorithm.

class AdaMaxOptimizer: Optimizer that implements the AdaMax algorithm.

class AdamGSOptimizer: Optimizer that implements the Adam algorithm.

class AdamWOptimizer: Optimizer that implements the Adam algorithm with weight decay.
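
A minimal usage sketch, assuming TensorFlow 1.15 and a toy quadratic loss invented purely for illustration:

```python
import tensorflow as tf  # TensorFlow 1.15

# Toy quadratic loss, purely for illustration.
w = tf.get_variable("w", shape=[10], initializer=tf.zeros_initializer())
loss = tf.reduce_sum(tf.square(w - 1.0))

# AdamW applies weight decay directly to the variables instead of folding an
# L2 penalty into the gradients.
optimizer = tf.contrib.opt.AdamWOptimizer(weight_decay=1e-4, learning_rate=1e-3)
train_op = optimizer.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)
```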

class AddSignOptimizer: Optimizer that implements the AddSign update.

class DecoupledWeightDecayExtension: This class allows to extend optimizers with decoupled weight decay.

class DropStaleGradientOptimizer: Wrapper optimizer that checks and drops stale gradient.

class ElasticAverageCustomGetter: Custom variable getter used together with ElasticAverageOptimizer.

class ElasticAverageOptimizer: Wrapper optimizer that implements the Elastic Average SGD algorithm.

class ExternalOptimizerInterface: Base class for interfaces with external optimization algorithms.

class GGTOptimizer: Optimizer that implements the GGT algorithm.

class LARSOptimizer: Layer-wise Adaptive Rate Scaling for large batch training.

class LazyAdamGSOptimizer: Variant of the Adam optimizer that handles sparse updates more efficiently.

class LazyAdamOptimizer: Variant of the Adam optimizer that handles sparse updates more efficiently.
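
A sketch of LazyAdam as a drop-in replacement for tf.train.AdamOptimizer, assuming TensorFlow 1.15 and an embedding-lookup model made up here so that the gradients are sparse:

```python
import tensorflow as tf  # TensorFlow 1.15

# Hypothetical embedding model, so the gradient w.r.t. `embeddings` is sparse.
embeddings = tf.get_variable("embeddings", shape=[10000, 64])
ids = tf.placeholder(tf.int32, shape=[None])
loss = tf.reduce_sum(tf.square(tf.nn.embedding_lookup(embeddings, ids)))

# LazyAdam only updates the Adam moment slots for rows that appear in `ids`.
optimizer = tf.contrib.opt.LazyAdamOptimizer(learning_rate=1e-3)
train_op = optimizer.minimize(loss)
```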

class ModelAverageCustomGetter: Custom variable getter used together with ModelAverageOptimizer.

class ModelAverageOptimizer: Wrapper optimizer that implements the Model Average algorithm.

class MomentumWOptimizer: Optimizer that implements the Momentum algorithm with weight decay.

class MovingAverageOptimizer: Optimizer that computes a moving average of the variables.
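
A sketch of wrapping another optimizer and saving the averaged weights, assuming TensorFlow 1.15; the loss is a stand-in:

```python
import tensorflow as tf  # TensorFlow 1.15

w = tf.get_variable("w", shape=[3], initializer=tf.ones_initializer())
loss = tf.reduce_sum(tf.square(w))  # stand-in loss

# Wrap any optimizer; an exponential moving average of the variables it
# updates is maintained alongside the raw values.
opt = tf.train.GradientDescentOptimizer(learning_rate=0.1)
opt = tf.contrib.opt.MovingAverageOptimizer(opt, average_decay=0.9999)
train_op = opt.minimize(loss)

# swapping_saver() saves the averaged values in place of the raw variables,
# so a restored checkpoint serves the smoothed weights directly.
saver = opt.swapping_saver()
```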

class MultitaskOptimizerWrapper: Optimizer wrapper making all-zero gradients harmless.
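
A sketch that pairs the wrapper with clip_gradients_by_global_norm (listed under Functions below), assuming TensorFlow 1.15 and a stand-in loss:

```python
import tensorflow as tf  # TensorFlow 1.15

w = tf.get_variable("w", shape=[5], initializer=tf.ones_initializer())
loss = tf.reduce_sum(tf.square(w))  # stand-in for a multitask loss

# The wrapper makes all-zero gradients (e.g. from an inactive task) leave the
# optimizer's slot variables untouched instead of decaying them.
optimizer = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9)
optimizer = tf.contrib.opt.MultitaskOptimizerWrapper(optimizer)

grads_and_vars = optimizer.compute_gradients(loss)
clipped, _ = tf.contrib.opt.clip_gradients_by_global_norm(grads_and_vars, 5.0)
train_op = optimizer.apply_gradients(clipped)
```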

class NadamOptimizer: Optimizer that implements the Nadam algorithm.

class PowerSignOptimizer: Optimizer that implements the PowerSign update.

class RegAdagradOptimizer: Adagrad with updates that optionally skip updating the slots.

class ScipyOptimizerInterface: Wrapper allowing scipy.optimize.minimize to operate within a tf.compat.v1.Session.
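
A minimal sketch, assuming TensorFlow 1.15 and SciPy installed; the quadratic objective is invented for illustration:

```python
import tensorflow as tf  # TensorFlow 1.15

x = tf.Variable([7.0, 7.0], name="x")
loss = tf.reduce_sum(tf.square(x))  # minimum at the origin

optimizer = tf.contrib.opt.ScipyOptimizerInterface(
    loss, method="L-BFGS-B", options={"maxiter": 100})

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    optimizer.minimize(sess)  # drives scipy.optimize.minimize inside the session
    print(sess.run(x))        # close to [0., 0.]
```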

class ShampooOptimizer: Optimizer that implements the Shampoo algorithm.

class VariableClippingOptimizer: Wrapper optimizer that clips the norm of specified variables after update.
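
A sketch of clipping each embedding row after every update, assuming TensorFlow 1.15; the embedding model is invented for illustration, and vars_to_clip_dims maps each variable to the dimensions over which its norm is computed:

```python
import tensorflow as tf  # TensorFlow 1.15

embeddings = tf.get_variable("embeddings", shape=[1000, 64])
ids = tf.placeholder(tf.int32, shape=[None])
loss = tf.reduce_sum(tf.square(tf.nn.embedding_lookup(embeddings, ids)))

# After each update, clip the norm of `embeddings` taken over dimension 1,
# i.e. keep every embedding row at norm <= 1.0.
opt = tf.train.AdagradOptimizer(learning_rate=0.1)
opt = tf.contrib.opt.VariableClippingOptimizer(
    opt, vars_to_clip_dims={embeddings: [1]}, max_norm=1.0)
train_op = opt.minimize(loss)
```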

Functions

clip_gradients_by_global_norm(...): Clips gradients of a multitask loss by their global norm.

extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
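
A sketch of the factory pattern, assuming TensorFlow 1.15; here it extends tf.train.MomentumOptimizer, but other standard optimizer classes can be passed as well:

```python
import tensorflow as tf  # TensorFlow 1.15

# Build a MomentumOptimizer variant with decoupled weight decay.
MomentumWithDecay = tf.contrib.opt.extend_with_decoupled_weight_decay(
    tf.train.MomentumOptimizer)

# The returned class takes weight_decay first, then the wrapped optimizer's
# usual constructor arguments.
optimizer = MomentumWithDecay(weight_decay=1e-4, learning_rate=0.01, momentum=0.9)
```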
