LostTech.TensorFlow : API Documentation

Type ProximalAdagradOptimizer

Namespace tensorflow.train

Parent Optimizer

Interfaces IProximalAdagradOptimizer

Optimizer that implements the Proximal Adagrad algorithm.

See this [paper](http://papers.nips.cc/paper/3793-efficient-learning-using-forward-backward-splitting.pdf).
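For reference, a sketch of the per-element update this optimizer performs (following the description of TensorFlow's underlying `ApplyProximalAdagrad` op; the symbols are illustrative: learning rate $\eta$, gradient $g$, accumulator $a$, weight $w$, and L1/L2 regularization strengths $\lambda_1$, $\lambda_2$):

$$
a \leftarrow a + g^2, \qquad
\tilde{w} = w - \frac{\eta}{\sqrt{a}}\, g, \qquad
w \leftarrow \frac{\operatorname{sign}(\tilde{w})}{1 + \frac{\eta}{\sqrt{a}}\,\lambda_2}\,
\max\!\left(|\tilde{w}| - \frac{\eta}{\sqrt{a}}\,\lambda_1,\; 0\right)
$$

The $\max(\cdot, 0)$ soft-thresholding contributed by the L1 term drives small weights exactly to zero, which plain Adagrad does not do.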


Public static methods

ProximalAdagradOptimizer NewDyn(object learning_rate, ImplicitContainer<T> initial_accumulator_value, ImplicitContainer<T> l1_regularization_strength, ImplicitContainer<T> l2_regularization_strength, ImplicitContainer<T> use_locking, ImplicitContainer<T> name)

Construct a new ProximalAdagrad optimizer. A usage sketch follows the parameter list below.
Parameters
object learning_rate
A `Tensor` or a floating point value. The learning rate.
ImplicitContainer<T> initial_accumulator_value
A floating point value. Starting value for the accumulators, must be positive.
ImplicitContainer<T> l1_regularization_strength
A float value, must be greater than or equal to zero.
ImplicitContainer<T> l2_regularization_strength
A float value, must be greater than or equal to zero.
ImplicitContainer<T> use_locking
If `True`, use locks for update operations.
ImplicitContainer<T> name
Optional name prefix for the operations created when applying gradients. Defaults to "ProximalAdagrad".
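A minimal usage sketch, assuming the binding mirrors the Python `tf` API for variable and loss construction, that the `ImplicitContainer<T>` parameters accept plain values via implicit conversion, and that the inherited `Optimizer.minimize` method is exposed; the variable `w` and the loss are hypothetical placeholders, not part of this page:

```csharp
using tensorflow;
using tensorflow.train;

// Hypothetical scalar objective for illustration: minimize w^2.
// (Assumes the binding mirrors Python's tf.Variable / tf.square.)
dynamic w = tf.Variable(0.5, name: "w");
dynamic loss = tf.square(w);

// Construct the optimizer via the NewDyn factory documented above.
var optimizer = ProximalAdagradOptimizer.NewDyn(
    learning_rate: 0.01,
    initial_accumulator_value: 0.1,      // must be positive
    l1_regularization_strength: 0.001,   // L1 shrinkage zeroes out small weights
    l2_regularization_strength: 0.001,
    use_locking: false,
    name: "ProximalAdagrad");

// minimize is inherited from Optimizer (assumed exposed by the binding).
dynamic trainOp = optimizer.minimize(loss);
```

Setting both regularization strengths to zero reduces the update to ordinary Adagrad, so the L1/L2 arguments are the main reason to prefer this optimizer.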

Public properties

object PythonObject get;