Type ExponentialUpdateLossScaleManager
Namespace tensorflow.contrib.mixed_precision
Parent PythonObjectContainer
Interfaces LossScaleManager, IExponentialUpdateLossScaleManager
Loss scale manager that uses an exponential update strategy. In general, the strategy increases the loss scale by a greater-than-one factor after encountering a consecutive series of steps with finite gradients; similarly, it decreases the loss scale by a factor once the accumulated number of steps with non-finite (NaN or Inf) gradients reaches a threshold. An update is not applied if its result would be less than 1 or would overflow the float32 dynamic range. The counts of finite and non-finite steps are cleared every time the loss scale is changed. The condition for decreasing the loss scale is looser than the one for increasing it, since the former does not require the steps to be consecutive.
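As a rough illustration, the sketch below shows how such a manager might be constructed and paired with a loss-scale optimizer, assuming the TensorFlow 1.x Python API that this type wraps; the parameter names (`init_loss_scale`, `incr_every_n_steps`, `decr_every_n_nan_or_inf`, `incr_ratio`, `decr_ratio`) and the `LossScaleOptimizer` wrapper come from that API, and `loss` is assumed to be an existing scalar loss tensor.

```python
# Minimal sketch using the TensorFlow 1.x Python API (tf.contrib.mixed_precision).
import tensorflow as tf

# Start at a large loss scale; double it after 1000 consecutive finite steps,
# and halve it once 2 steps with non-finite gradients have accumulated.
loss_scale_manager = tf.contrib.mixed_precision.ExponentialUpdateLossScaleManager(
    init_loss_scale=2 ** 15,
    incr_every_n_steps=1000,
    decr_every_n_nan_or_inf=2,
    incr_ratio=2.0,
    decr_ratio=0.5)

# Wrap an ordinary optimizer so gradients are computed on the scaled loss and
# unscaled before being applied; the manager adjusts the scale after each step.
opt = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9)
opt = tf.contrib.mixed_precision.LossScaleOptimizer(opt, loss_scale_manager)

train_op = opt.minimize(loss)  # `loss` is an assumed, pre-existing scalar tensor
```

With these settings, the scale grows geometrically while training is stable and backs off quickly after overflows, which is the behavior described above.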