LostTech.TensorFlow : API Documentation

Type AdagradOptimizer

Namespace tensorflow.train

Parent Optimizer

Interfaces IAdagradOptimizer

Optimizer that implements the Adagrad algorithm.

See this [paper](http://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf) or this [intro](https://ppasupat.github.io/a9online/uploads/proximal_notes.pdf).
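The per-parameter update Adagrad performs can be sketched in plain NumPy. This is an illustration of the algorithm from the linked paper, not this library's API; the `epsilon` term is a common numerical-stability addition and is an assumption here:

```python
import numpy as np

def adagrad_step(param, grad, accum, learning_rate=0.1, epsilon=1e-10):
    """One Adagrad update: accumulate squared gradients, then scale the step."""
    accum = accum + grad ** 2
    param = param - learning_rate * grad / (np.sqrt(accum) + epsilon)
    return param, accum

# Coordinates that receive gradient repeatedly see their effective
# step size shrink; untouched coordinates are left alone.
param = np.array([1.0, 1.0])
accum = np.full_like(param, 0.1)   # mirrors initial_accumulator_value
for _ in range(3):
    grad = np.array([1.0, 0.0])    # only the first coordinate gets gradient
    param, accum = adagrad_step(param, grad, accum)
```

Because the accumulator only grows, Adagrad's learning rate decays monotonically per coordinate, which is why it suits sparse features that appear infrequently.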

Public static methods

AdagradOptimizer NewDyn(object learning_rate, ImplicitContainer<T> initial_accumulator_value, ImplicitContainer<T> use_locking, ImplicitContainer<T> name)

Construct a new Adagrad optimizer.
object learning_rate
A `Tensor` or a floating point value. The learning rate.
ImplicitContainer<T> initial_accumulator_value
A floating point value. Starting value for the accumulators, must be positive.
ImplicitContainer<T> use_locking
If `True`, use locks for update operations.
ImplicitContainer<T> name
Optional name prefix for the operations created when applying gradients. Defaults to "Adagrad".
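The `initial_accumulator_value` seeds the running sum of squared gradients. A quick NumPy sketch (using the standard Adagrad step formula, not this library's API) shows why it must be positive and how larger values damp the first updates:

```python
import numpy as np

# First-step size for a unit gradient under different accumulator seeds.
# Adagrad divides by sqrt(accum + grad**2), so a larger seed shrinks the
# initial step, and a non-positive seed would risk division by zero.
grad = 1.0
learning_rate = 0.1
steps = [learning_rate * grad / np.sqrt(init + grad ** 2)
         for init in (0.1, 1.0, 10.0)]
```

The step sizes decrease as the seed grows, so the default small value keeps early training close to plain gradient descent.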

Public properties

object PythonObject get;