Type TowerOptimizer
Namespace tensorflow_estimator.contrib.estimator
Parent Optimizer
Interfaces ITowerOptimizer
Methods
- apply_gradients (12 overloads)
- apply_gradients_dyn
- compute_gradients (3 overloads)
- compute_gradients_dyn (3 overloads)
- get_name (2 overloads)
- get_name_dyn (2 overloads)
- has_been_used
- has_been_used_dyn
- NewDyn
- variables (2 overloads)
- variables_dyn (2 overloads)
Properties
- COLLECTION_FOR_GRAPH_STATES_dyn
- PythonObject
Fields
- COLLECTION_FOR_GRAPH_STATES
Public instance methods
object apply_gradients(IEnumerable<object> grads_and_vars, IGraphNodeBase global_step, PythonFunctionContainer name)
Apply gradients to variables.  This is the second part of `minimize()`. It returns an `Operation` that
applies gradients. 
Parameters
- IEnumerable<object> grads_and_vars: List of (gradient, variable) pairs as returned by `compute_gradients()`.
- IGraphNodeBase global_step: Optional `Variable` to increment by one after the variables have been updated.
- PythonFunctionContainer name: Optional name for the returned operation. Defaults to the name passed to the `Optimizer` constructor.
Returns
- object: An `Operation` that applies the specified gradients. If `global_step` was not None, that operation also increments `global_step`.
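The two-step form shown above lets the (gradient, variable) pairs be inspected or transformed between `compute_gradients()` and `apply_gradients()`. A minimal C# sketch of that split, assuming `wrappedOptimizer` is a `TowerOptimizer` instance and that `lossTensor` and `globalStep` are hypothetical placeholders for values built elsewhere in the model function:

```csharp
using System.Collections.Generic;

// Hedged sketch: only the TowerOptimizer member calls are taken from this page;
// wrappedOptimizer, lossTensor, and globalStep stand in for objects created elsewhere.
IEnumerable<object> gradsAndVars =
    (IEnumerable<object>)wrappedOptimizer.compute_gradients(lossTensor);

// Gradients could be clipped or logged here before they are applied.

object trainOp = wrappedOptimizer.apply_gradients(
    grads_and_vars: gradsAndVars,
    global_step: globalStep,   // the returned Operation also increments this step counter
    name: null);               // falls back to the name passed to the Optimizer constructor
```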
object apply_gradients(object grads_and_vars, int global_step, IDictionary<string, object> kwargs)
object apply_gradients(object grads_and_vars, BaseResourceVariable global_step, IDictionary<string, object> kwargs)
object apply_gradients(IEnumerable<object> grads_and_vars, int global_step, PythonFunctionContainer name)
Apply gradients to variables.  This is the second part of `minimize()`. It returns an `Operation` that
applies gradients. 
Parameters
- IEnumerable<object> grads_and_vars: List of (gradient, variable) pairs as returned by `compute_gradients()`.
- int global_step: Optional `Variable` to increment by one after the variables have been updated.
- PythonFunctionContainer name: Optional name for the returned operation. Defaults to the name passed to the `Optimizer` constructor.
Returns
- object: An `Operation` that applies the specified gradients. If `global_step` was not None, that operation also increments `global_step`.
object apply_gradients(ValueTuple<IEnumerable<object>, object> grads_and_vars, IGraphNodeBase global_step, IDictionary<string, object> kwargs)
object apply_gradients(ValueTuple<IEnumerable<object>, object> grads_and_vars, BaseResourceVariable global_step, IDictionary<string, object> kwargs)
object apply_gradients(ValueTuple<IEnumerable<object>, object> grads_and_vars, int global_step, IDictionary<string, object> kwargs)
object apply_gradients(IEnumerable<object> grads_and_vars, int global_step, IDictionary<string, object> kwargs)
object apply_gradients(object grads_and_vars, IGraphNodeBase global_step, IDictionary<string, object> kwargs)
object apply_gradients(IEnumerable<object> grads_and_vars, BaseResourceVariable global_step, IDictionary<string, object> kwargs)
object apply_gradients(IEnumerable<object> grads_and_vars, BaseResourceVariable global_step, PythonFunctionContainer name)
Apply gradients to variables.  This is the second part of `minimize()`. It returns an `Operation` that
applies gradients. 
Parameters
- IEnumerable<object> grads_and_vars: List of (gradient, variable) pairs as returned by `compute_gradients()`.
- BaseResourceVariable global_step: Optional `Variable` to increment by one after the variables have been updated.
- PythonFunctionContainer name: Optional name for the returned operation. Defaults to the name passed to the `Optimizer` constructor.
Returns
- object: An `Operation` that applies the specified gradients. If `global_step` was not None, that operation also increments `global_step`.
object apply_gradients(IEnumerable<object> grads_and_vars, IGraphNodeBase global_step, IDictionary<string, object> kwargs)
object apply_gradients_dyn(object grads_and_vars, object global_step, IDictionary<string, object> kwargs)
object compute_gradients(PythonFunctionContainer loss, Object[] args)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- PythonFunctionContainer loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- Object[] args
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
object compute_gradients(object loss, IEnumerable<object> var_list, ImplicitContainer<T> gate_gradients, object aggregation_method, bool colocate_gradients_with_ops, IGraphNodeBase grad_loss)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- object loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- IEnumerable<object> var_list: Optional list or tuple of tf.Variable to update to minimize `loss`. Defaults to the list of variables collected in the graph under the key `GraphKeys.TRAINABLE_VARIABLES`.
- ImplicitContainer<T> gate_gradients: How to gate the computation of gradients. Can be `GATE_NONE`, `GATE_OP`, or `GATE_GRAPH`.
- object aggregation_method: Specifies the method used to combine gradient terms. Valid values are defined in the class `AggregationMethod`.
- bool colocate_gradients_with_ops: If True, try colocating gradients with the corresponding op.
- IGraphNodeBase grad_loss: Optional. A `Tensor` holding the gradient computed for `loss`.
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
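A hedged sketch of this overload with every argument spelled out; all identifiers other than the `compute_gradients` call itself (`wrappedOptimizer`, `lossTensor`, `towerVariables`, `gateOp`) are hypothetical placeholders for objects obtained elsewhere:

```csharp
// Hedged sketch: argument names follow the overload documented above.
object pairs = wrappedOptimizer.compute_gradients(
    loss: lossTensor,
    var_list: towerVariables,              // when omitted, GraphKeys.TRAINABLE_VARIABLES is used
    gate_gradients: gateOp,                // GATE_NONE, GATE_OP, or GATE_GRAPH
    aggregation_method: null,              // use the default AggregationMethod
    colocate_gradients_with_ops: true,     // keep gradient ops next to their forward ops
    grad_loss: null);                      // no externally supplied gradient for the loss

// `pairs` holds (gradient, variable) tuples; a gradient entry can be null (None)
// when no gradient exists for that variable.
```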
object compute_gradients(PythonFunctionContainer loss, IDictionary<string, object> kwargs, Object[] args)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- PythonFunctionContainer loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- IDictionary<string, object> kwargs
- Object[] args
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
object compute_gradients_dyn(object loss, object var_list, ImplicitContainer<T> gate_gradients, object aggregation_method, ImplicitContainer<T> colocate_gradients_with_ops, object grad_loss)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- object loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- object var_list: Optional list or tuple of tf.Variable to update to minimize `loss`. Defaults to the list of variables collected in the graph under the key `GraphKeys.TRAINABLE_VARIABLES`.
- ImplicitContainer<T> gate_gradients: How to gate the computation of gradients. Can be `GATE_NONE`, `GATE_OP`, or `GATE_GRAPH`.
- object aggregation_method: Specifies the method used to combine gradient terms. Valid values are defined in the class `AggregationMethod`.
- ImplicitContainer<T> colocate_gradients_with_ops: If True, try colocating gradients with the corresponding op.
- object grad_loss: Optional. A `Tensor` holding the gradient computed for `loss`.
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
object compute_gradients_dyn(object loss, Object[] args)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- object loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- Object[] args
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
object compute_gradients_dyn(object loss, IDictionary<string, object> kwargs, Object[] args)
Compute gradients of `loss` for the variables in `var_list`.  This is the first part of `minimize()`.  It returns a list
of (gradient, variable) pairs where "gradient" is the gradient
for "variable".  Note that "gradient" can be a `Tensor`, an
`IndexedSlices`, or `None` if there is no gradient for the
given variable. 
Parameters
- object loss: A Tensor containing the value to minimize or a callable taking no arguments which returns the value to minimize. When eager execution is enabled it must be a callable.
- IDictionary<string, object> kwargs
- Object[] args
Returns
- object: A list of (gradient, variable) pairs. Variable is always present, but gradient can be `None`.
string get_name(IDictionary<string, object> kwargs, Object[] args)
string get_name(Object[] args)
object get_name_dyn(Object[] args)
object get_name_dyn(IDictionary<string, object> kwargs, Object[] args)
object variables(Object[] args)
object variables(IDictionary<string, object> kwargs, Object[] args)
object variables_dyn(IDictionary<string, object> kwargs, Object[] args)
object variables_dyn(Object[] args)
Public static methods
bool has_been_used()
object has_been_used_dyn()
TowerOptimizer NewDyn(object optimizer_or_optimizer_fn)
Creates a `TowerOptimizer` that wraps the supplied optimizer (or optimizer-producing function) so gradients can be gathered across towers.
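A minimal construction sketch, assuming a base optimizer instance has already been built elsewhere (how that instance is created is outside this page); the wrapper is then used inside the per-tower model function in place of the base optimizer:

```csharp
// Hedged sketch: baseOptimizer is a hypothetical placeholder for any optimizer
// instance (or a function returning one) accepted by optimizer_or_optimizer_fn.
object baseOptimizer = /* built elsewhere, e.g. a tf.train optimizer */ null;

TowerOptimizer towerOptimizer = TowerOptimizer.NewDyn(baseOptimizer);

// From here on, towerOptimizer is used for the usual compute_gradients() /
// apply_gradients() (or minimize()) calls so that per-tower gradients are gathered.
```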
Public properties
object COLLECTION_FOR_GRAPH_STATES_dyn get; set;
object PythonObject get;
Public fields
string COLLECTION_FOR_GRAPH_STATES