LostTech.TensorFlow : API Documentation

Type ConditionalAccumulator

Namespace tensorflow

Parent ConditionalAccumulatorBase

Interfaces IConditionalAccumulator

A conditional accumulator for aggregating gradients.

Up-to-date gradients (i.e., gradients whose computation time step equals the accumulator's current time step) are added to the accumulator.

Extraction of the average gradient is blocked until the required number of gradients has been accumulated.
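
For orientation, here is a minimal usage sketch in C#. Only apply_grad and take_grad are documented on this page; the constructor call, tf.constant, the Session plumbing, and the overload/default-argument resolution are assumptions modeled on the Python tf.ConditionalAccumulator API that this binding mirrors.

using tensorflow;

// Minimal sketch: the constructor arguments and Session calls below are
// assumptions based on Python's tf.ConditionalAccumulator(dtype, ...).
var acc = new ConditionalAccumulator(tf.float32);
var sess = new Session();

// Each worker applies its gradient, stamped with the time step it was computed at.
sess.run(acc.apply_grad(tf.constant(2.0f), local_step: 0));
sess.run(acc.apply_grad(tf.constant(4.0f), local_step: 0));

// take_grad blocks until num_required gradients have been applied at the
// current time step, then returns their average: (2.0 + 4.0) / 2 = 3.0.
var avg = sess.run(acc.take_grad(num_required: 2));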

Public instance methods

object apply_grad(object grad, ImplicitContainer<T> local_step, ndarray name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
ndarray name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.
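
To illustrate the staleness rule, a hedged continuation of the sketch above: once the accumulator's global time step has advanced to 1, a gradient stamped with local_step 0 is dropped without error, while one stamped with the current step is accumulated.

// Continuing the sketch above: one successful take_grad has advanced the
// accumulator's global time step to 1.
sess.run(acc.apply_grad(tf.constant(0.5f), local_step: 0)); // stale: silently dropped
sess.run(acc.apply_grad(tf.constant(0.5f), local_step: 1)); // up to date: accumulated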

object apply_grad(object grad, ImplicitContainer<T> local_step, IDictionary<object, object> name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
IDictionary<object, object> name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.

object apply_grad(object grad, ImplicitContainer<T> local_step, IEnumerable<int> name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
IEnumerable<int> name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.

object apply_grad(object grad, ImplicitContainer<T> local_step, ValueTuple<int, object> name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
ValueTuple<int, object> name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.

object apply_grad(object grad, ImplicitContainer<T> local_step, IGraphNodeBase name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
IGraphNodeBase name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.

object apply_grad_dyn(object grad, ImplicitContainer<T> local_step, object name)

Attempts to apply a gradient to the accumulator.

The attempt is silently dropped if the gradient is stale, i.e., local_step is less than the accumulator's global time step.
Parameters
object grad
The gradient tensor to be applied.
ImplicitContainer<T> local_step
Time step at which the gradient was computed.
object name
Optional name for the operation.
Returns
object
The operation that (conditionally) applies a gradient to the accumulator.
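
apply_grad_dyn appears to be the late-bound counterpart of apply_grad, taking plain object parameters. Assuming it behaves identically at run time, a hedged one-liner:

// Assumption: same semantics as apply_grad, dispatched dynamically.
object op = acc.apply_grad_dyn(tf.constant(1.0f), local_step: 1, name: "apply");
sess.run(op);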

Tensor take_grad(int num_required, string name)

Attempts to extract the average gradient from the accumulator.

The operation blocks until a sufficient number of gradients have been successfully applied to the accumulator.

Once successful, the following actions are also triggered:

- The counter of accumulated gradients is reset to 0.
- The aggregated gradient is reset to a zero tensor.
- The accumulator's internal time step is incremented by 1.
Parameters
int num_required
Number of gradients that need to have been aggregated.
string name
Optional name for the operation.
Returns
Tensor
A tensor holding the value of the average gradient.
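
To make the reset behavior concrete, a hedged continuation of the first sketch, after take_grad(num_required: 2) returned the average 3.0:

// The counter and the aggregated gradient are back at 0 and the internal
// time step is now 1, so the next round must be stamped with local_step >= 1.
sess.run(acc.apply_grad(tf.constant(6.0f), local_step: 1));
var second = sess.run(acc.take_grad(num_required: 1)); // 6.0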

Tensor take_grad(IGraphNodeBase num_required, string name)

Attempts to extract the average gradient from the accumulator.

The operation blocks until a sufficient number of gradients have been successfully applied to the accumulator.

Once successful, the following actions are also triggered:

- The counter of accumulated gradients is reset to 0.
- The aggregated gradient is reset to a zero tensor.
- The accumulator's internal time step is incremented by 1.
Parameters
IGraphNodeBase num_required
Number of gradients that need to have been aggregated.
string name
Optional name for the operation.
Returns
Tensor
A tensor holding the value of the average gradient.

object take_grad_dyn(object num_required, object name)

Attempts to extract the average gradient from the accumulator.

The operation blocks until a sufficient number of gradients have been successfully applied to the accumulator.

Once successful, the following actions are also triggered:

- The counter of accumulated gradients is reset to 0.
- The aggregated gradient is reset to a zero tensor.
- The accumulator's internal time step is incremented by 1.
Parameters
object num_required
Number of gradients that need to have been aggregated.
object name
Optional name for the operation.
Returns
object
A tensor holding the value of the average gradient.

Public properties

object accumulator_ref get;

object accumulator_ref_dyn get;

object dtype get;

object dtype_dyn get;

string name get;

object name_dyn get;

object PythonObject get;