Type BaselineEstimator
Namespace tensorflow_estimator.python.estimator.api.estimator
Parent Estimator
Interfaces IBaselineEstimator
Methods
Properties
Public static methods
BaselineEstimator NewDyn(object head, object model_dir, ImplicitContainer<T> optimizer, object config)
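For orientation, a minimal sketch of the equivalent constructor call in the underlying Python estimator API; the head, model_dir, and optimizer values below are illustrative assumptions, not defaults documented on this page.

import tensorflow as tf

head = tf.estimator.RegressionHead()        # assumed head; any estimator Head works
estimator = tf.estimator.BaselineEstimator(
    head=head,
    model_dir="/tmp/baseline_model",        # assumed checkpoint directory
    optimizer="Ftrl")                       # assumed optimizer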
Applies cosine decay to the learning rate. See Loshchilov & Hutter, SGDR: Stochastic Gradient Descent with Warm Restarts (ICLR 2017), https://arxiv.org/abs/1608.03983.
When training a model, it is often recommended to lower the learning rate as training progresses. This schedule applies a cosine decay function to an optimizer step, given a provided initial learning rate.
The schedule needs a `step` value to compute the decayed learning rate; you can simply pass a TensorFlow variable that you increment at each training step. The schedule is a 1-arg callable that produces a decayed learning rate when passed the current optimizer step. This can be useful for changing the learning rate across different invocations of optimizer functions.
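For instance, a minimal sketch of calling the schedule directly with a step variable, assuming the schedule is tf.keras.optimizers.schedules.CosineDecay (which this docstring matches) and using illustrative hyperparameter values:

import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=0.1, decay_steps=1000)  # illustrative values
step = tf.Variable(0, trainable=False)
step.assign_add(1)               # increment once per training step
current_lr = lr_schedule(step)   # scalar Tensor: the decayed rate at this step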
It is computed as shown in the Show Example block below.
You can pass this schedule directly into a tf.keras.optimizers.Optimizer as the learning rate. The learning rate schedule is also serializable and deserializable using tf.keras.optimizers.schedules.serialize and tf.keras.optimizers.schedules.deserialize.
Example usage:
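The snippet below is a minimal sketch of that usage in the underlying Python Keras API; the SGD optimizer and the hyperparameter values are illustrative assumptions.

import tensorflow as tf

initial_learning_rate = 0.1   # assumed value
decay_steps = 1000            # assumed value
lr_schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate, decay_steps)

# Pass the schedule directly to an optimizer as the learning rate.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)

# The schedule round-trips through serialize/deserialize.
config = tf.keras.optimizers.schedules.serialize(lr_schedule)
restored = tf.keras.optimizers.schedules.deserialize(config)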
Returns
BaselineEstimator - A 1-arg callable learning rate schedule that takes the current optimizer step and outputs the decayed learning rate, a scalar `Tensor` of the same type as `initial_learning_rate`.
Show Example
from math import cos, pi

def decayed_learning_rate(step):
    # Clamp: the rate stops decaying once `decay_steps` is reached.
    step = min(step, decay_steps)
    cosine_decay = 0.5 * (1 + cos(pi * step / decay_steps))
    # `alpha` is the floor, as a fraction of `initial_learning_rate`.
    decayed = (1 - alpha) * cosine_decay + alpha
    return initial_learning_rate * decayed
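A quick numeric check with illustrative values (alpha = 0 is an assumption for this sketch): at the halfway step, the schedule returns half the initial rate.

initial_learning_rate, decay_steps, alpha = 0.1, 1000, 0.0
print(decayed_learning_rate(500))   # ≈ 0.05, since cos(pi/2) = 0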