Rate Controllers
RateController

class sconce.rate_controllers.RateController

    The base class of all rate controllers in Sconce. It is only an interface, describing what must be implemented if you want to define a rate controller.
    new_learning_rate(step, data)

        Called by a Trainer during a training/evaluation session, just before the training step.

        Parameters:
            - step (float) – the step that was just completed, in (0.0, inf). Fractional steps are possible (see the batch_multiplier option on sconce.trainer.Trainer.train()).
            - data (dict) – the output of the training/evaluation step. The keys may include, but are not limited to: {'training_loss', 'test_loss', 'learning_rate'}.

        Returns:
            The new learning rate to use for the next training step. If this is a CompositeRateController, an OrderedDict is returned whose keys are like {'group 0', 'group 1', etc.} and whose values are the new learning rate (float) for that parameter group.

        Return type:
            float or collections.OrderedDict
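To make the interface concrete, here is a minimal standalone sketch (not sconce source code): a stand-in base class plus a hypothetical subclass, `HalvingRateController`, that halves the rate every 100 steps. The subclass name and its schedule are assumptions for illustration only.

```python
class RateController:
    """Minimal stand-in for the interface described above."""

    def new_learning_rate(self, step, data):
        # Subclasses return the learning rate for the next training step.
        raise NotImplementedError


class HalvingRateController(RateController):
    """Hypothetical example: halve the learning rate every 100 steps."""

    def __init__(self, initial_rate):
        self.initial_rate = initial_rate

    def new_learning_rate(self, step, data):
        # 'data' carries metrics like 'training_loss'; this simple
        # schedule ignores them and depends only on the step count.
        return self.initial_rate * 0.5 ** (step // 100)
```

A trainer would call `new_learning_rate` once per training step and apply the returned value to the optimizer before the next step.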
CompositeRateController

class sconce.rate_controllers.CompositeRateController(rate_controllers)

    A rate controller composed of two or more rate controllers. Using this allows you to pass a single rate controller to a trainer while controlling the learning rate of multiple parameter groups. The order in which the controllers are added is important: it aligns with the order of the Optimizer's parameter_groups.

    Parameters:
        rate_controllers (iterable of RateController) – the rate controllers you want to compose together.

    New in 0.9.0.
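The per-group delegation can be sketched as follows. This is an assumed standalone reimplementation, not the library's source; `FixedRate` is a hypothetical helper used only to make the example runnable.

```python
from collections import OrderedDict


class FixedRate:
    """Hypothetical per-group controller that always returns one rate."""

    def __init__(self, rate):
        self.rate = rate

    def new_learning_rate(self, step, data):
        return self.rate


class CompositeRateController:
    """Sketch: delegate to one controller per optimizer parameter group."""

    def __init__(self, rate_controllers):
        # Order matters: controller i drives parameter group i.
        self.rate_controllers = list(rate_controllers)

    def new_learning_rate(self, step, data):
        # Keys follow the {'group 0', 'group 1', ...} convention noted above.
        return OrderedDict(
            ('group %d' % i, rc.new_learning_rate(step, data))
            for i, rc in enumerate(self.rate_controllers)
        )
```

Composing, say, a fast rate for a model's head and a slow rate for its backbone then requires passing only this one object to the trainer.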
ConstantRateController

class sconce.rate_controllers.ConstantRateController(learning_rate, drop_factor=0.1, movement_key='training_loss', movement_threshold=0.25, movement_window=None, num_drops=0)

    A rate controller that yields a constant learning rate. It can drop its learning rate by <drop_factor> up to <num_drops> times when it detects that some metric or loss has stopped moving.
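The exact plateau-detection rule is not spelled out above, so the following is only a plausible sketch of the idea, under an assumed semantics: track the monitored metric over a trailing window and drop the rate when the metric's range within that window falls below `movement_threshold`. It is not the library's implementation.

```python
class ConstantRateController:
    """Sketch of the plateau-drop idea (assumed semantics, not sconce source)."""

    def __init__(self, learning_rate, drop_factor=0.1,
                 movement_key='training_loss', movement_threshold=0.25,
                 movement_window=10, num_drops=1):
        self.learning_rate = learning_rate
        self.drop_factor = drop_factor
        self.movement_key = movement_key
        self.movement_threshold = movement_threshold
        self.movement_window = movement_window
        self.num_drops = num_drops
        self.drops_taken = 0
        self.history = []

    def new_learning_rate(self, step, data):
        if self.movement_key in data:
            self.history.append(data[self.movement_key])
        window = self.history[-self.movement_window:]
        if (len(window) == self.movement_window
                and self.drops_taken < self.num_drops):
            movement = max(window) - min(window)  # how far the metric moved
            if movement < self.movement_threshold:
                # Metric looks flat: drop the rate and restart detection.
                self.learning_rate *= self.drop_factor
                self.drops_taken += 1
                self.history = []
        return self.learning_rate
```

With `num_drops=0` (the documented default) no drop is ever taken, so the controller behaves as a plain constant rate.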