Type: MomentumOptimizer
Namespace: tensorflow.train
Parent: Optimizer
Interfaces: IMomentumOptimizer
Optimizer that implements the Momentum algorithm.

Computes (if `use_nesterov = False`):

```
accumulation = momentum * accumulation + gradient
variable -= learning_rate * accumulation
```

Note that in the dense version of this algorithm, `accumulation` is updated and applied regardless of a gradient's value, whereas the sparse version (when the gradient is an `IndexedSlices`, typically because of a `tf.gather` or an embedding lookup) only updates variable slices and the corresponding `accumulation` terms when that part of the variable was used in the forward pass.
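As a concrete illustration of the update rule above, here is a minimal NumPy sketch of one dense step. The function name and default hyperparameters are illustrative assumptions, and the Nesterov branch shows one common formulation rather than the library's exact implementation:

```
import numpy as np

def momentum_step(variable, gradient, accumulation,
                  learning_rate=0.01, momentum=0.9, use_nesterov=False):
    # Documented update (use_nesterov = False):
    #   accumulation = momentum * accumulation + gradient
    #   variable    -= learning_rate * accumulation
    accumulation = momentum * accumulation + gradient
    if use_nesterov:
        # A common Nesterov formulation (assumption, not confirmed by this page):
        # step along the gradient plus the momentum-scaled updated velocity.
        variable -= learning_rate * (gradient + momentum * accumulation)
    else:
        variable -= learning_rate * accumulation
    return accumulation

var = np.array([1.0, 2.0, 3.0])
accum = np.zeros_like(var)
grad = np.array([0.1, -0.2, 0.3])
accum = momentum_step(var, grad, accum, learning_rate=0.1)
print(var)  # [0.99, 2.02, 2.97] -- first step moves opposite the gradient
```

Because the accumulator starts at zero, the first step is an ordinary gradient step; subsequent steps gain velocity in directions where gradients agree.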
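To make the dense/sparse distinction concrete, here is a hypothetical sketch of the sparse path, assuming the gradient arrives as (values, indices) pairs as an `IndexedSlices` would carry. Only the indexed rows of the variable and their `accumulation` entries change:

```
import numpy as np

def sparse_momentum_step(variable, grad_values, grad_indices, accumulation,
                         learning_rate=0.01, momentum=0.9):
    # Only rows named in grad_indices are touched; every other row of both
    # `variable` and `accumulation` is left exactly as it was.
    for i, g in zip(grad_indices, grad_values):
        accumulation[i] = momentum * accumulation[i] + g
        variable[i] -= learning_rate * accumulation[i]

# Usage: a small "embedding table" where only rows 0 and 2 were gathered
# in the forward pass, so only those rows receive an update.
table = np.ones((4, 2))
accum = np.zeros_like(table)
sparse_momentum_step(table,
                     grad_values=np.array([[0.5, 0.5], [1.0, 1.0]]),
                     grad_indices=[0, 2],
                     accumulation=accum,
                     learning_rate=0.1)
print(table)  # rows 1 and 3 remain all ones; rows 0 and 2 moved
```

This mirrors the behavior described above: rows that did not participate in the forward pass keep a stale accumulator rather than having momentum decay applied every step.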