sconce.rate_controllers package¶
sconce.rate_controllers.base module¶
class sconce.rate_controllers.base.CompositeRateController(rate_controllers)[source]¶
Bases: sconce.rate_controllers.base.RateController

A rate controller composed of two or more rate controllers. Using this allows you to pass a single rate controller to a trainer and control the learning rate of multiple parameter groups. The order in which the controllers are added is important: it aligns with the order of the Optimizer's parameter_groups.

Parameters: rate_controllers (iterable of RateController) – the rate_controllers you want to compose together.

New in 0.9.0
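For example, a minimal sketch of composing two controllers, assuming an optimizer with two parameter groups. The import paths are the documented module paths; the learning-rate values and group roles are illustrative:

    from sconce.rate_controllers.base import CompositeRateController
    from sconce.rate_controllers.cosine_rate_controller import CosineRateController

    # Order matters: the first controller drives the optimizer's first
    # parameter group, the second drives the second.
    controller = CompositeRateController(rate_controllers=[
        CosineRateController(max_learning_rate=1e-2),  # e.g. the model's head
        CosineRateController(max_learning_rate=1e-4),  # e.g. the backbone
    ])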
class sconce.rate_controllers.base.RateController[source]¶
Bases: abc.ABC

The base class of all rate controllers in Sconce. It is only an interface, describing what must be implemented if you want to define a rate controller.
new_learning_rate(step, data)[source]¶
Called by a Trainer during a training/evaluation session, just before the next training step.

Parameters:
- data (dict) – the output of the training/evaluation step. The keys may include, but are not limited to: {'training_loss', 'test_loss', 'learning_rate'}.
- step (float) – (0.0, inf) the step that was just completed. Fractional steps are possible (see the batch_multiplier option on sconce.trainer.Trainer.train()).

Returns: The new learning rate that should be used for the next training step. If this is a CompositeRateController, an OrderedDict is returned instead, with keys like {'group 0', 'group 1', etc.} and, as values, the new learning rate (float) for each parameter group.

Return type: new_learning_rate (float, collections.OrderedDict)
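As a sketch of the interface, here is a hypothetical controller that halves the learning rate whenever the monitored loss increases. It assumes new_learning_rate() is the only method a subclass must implement and that the base class needs no constructor arguments:

    from sconce.rate_controllers.base import RateController

    class HalvingRateController(RateController):
        """Hypothetical example: halve the rate whenever training_loss rises."""

        def __init__(self, learning_rate):
            self.learning_rate = learning_rate
            self._last_loss = None

        def new_learning_rate(self, step, data):
            loss = data.get('training_loss')
            if self._last_loss is not None and loss is not None and loss > self._last_loss:
                self.learning_rate *= 0.5
            self._last_loss = loss
            return self.learning_rate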
sconce.rate_controllers.constant_rate_controller module¶
class sconce.rate_controllers.constant_rate_controller.ConstantRateController(learning_rate, drop_factor=0.1, movement_key='training_loss', movement_threshold=0.25, movement_window=None, num_drops=0)[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that is constant. It can adjust the learning rate by <drop_factor>, up to <num_drops> times, when it detects that some metric or loss has stopped moving.
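A construction sketch using the documented signature; the exact movement-detection semantics are inferred from the parameter names, so the comments are assumptions:

    from sconce.rate_controllers.constant_rate_controller import ConstantRateController

    controller = ConstantRateController(
        learning_rate=1e-3,
        drop_factor=0.1,               # multiply the rate by 0.1 at each drop
        movement_key='training_loss',  # the metric watched for stalling
        movement_threshold=0.25,
        movement_window=None,
        num_drops=2,                   # allow 1e-3 -> 1e-4 -> 1e-5
    )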
sconce.rate_controllers.cosine_rate_controller module¶
class sconce.rate_controllers.cosine_rate_controller.CosineRateController(max_learning_rate, min_learning_rate=None)[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that follows a scaled and shifted cosine function on [0, pi/2]. It begins at <max_learning_rate> and ends at <min_learning_rate> after <num_steps>.
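Note that <num_steps> is not a constructor argument; it comes from the length of the training session. The shape of the schedule, written out in plain Python for illustration (not sconce internals; the min_lr default of 0.0 is an assumption):

    import math

    def cosine_rate(step, num_steps, max_lr, min_lr=0.0):
        # cos on [0, pi/2] falls from 1 to 0, so the rate falls max -> min.
        fraction = step / num_steps
        return min_lr + (max_lr - min_lr) * math.cos(fraction * math.pi / 2)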
sconce.rate_controllers.exponential_rate_controller module¶
class sconce.rate_controllers.exponential_rate_controller.ExponentialRateController(min_learning_rate, max_learning_rate, stop_factor=None, loss_key='training_loss')[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that rises exponentially from <min_learning_rate> to <max_learning_rate> over <num_steps>.
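This shape is commonly used for learning-rate range tests. A construction sketch with the documented signature; the stop_factor semantics are an assumption (abort the session once the watched loss blows up), not documented behavior:

    from sconce.rate_controllers.exponential_rate_controller import ExponentialRateController

    controller = ExponentialRateController(
        min_learning_rate=1e-6,
        max_learning_rate=10.0,
        stop_factor=10,            # assumption: stop once the loss grows ~10x
        loss_key='training_loss',  # which key of the step data to watch
    )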
sconce.rate_controllers.linear_rate_controller module¶
class sconce.rate_controllers.linear_rate_controller.LinearRateController(min_learning_rate, max_learning_rate, stop_factor=None, loss_key='training_loss')[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that rises linearly from <min_learning_rate> to <max_learning_rate> over <num_steps>.
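The schedule, written out for illustration (not sconce internals):

    def linear_rate(step, num_steps, min_lr, max_lr):
        # Straight-line interpolation from min_lr at step 0 to max_lr
        # at step num_steps.
        return min_lr + (max_lr - min_lr) * (step / num_steps)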
sconce.rate_controllers.step_rate_controller module¶
class sconce.rate_controllers.step_rate_controller.StepRateController(max_learning_rate, min_learning_rate, num_drops=1)[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that falls in <num_drops> drops from <max_learning_rate> to <min_learning_rate> over the course of <num_steps>. The learning rate is constant between drops.
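One plausible reading of that description, written out for illustration (not sconce internals); the choice of equal geometric drops is an assumption:

    def step_rate(step, num_steps, max_lr, min_lr, num_drops=1):
        # Piecewise-constant: num_drops equally spaced drops, each scaling
        # the rate by the same factor, ending at min_lr.
        drops_so_far = min(int(step / num_steps * (num_drops + 1)), num_drops)
        return max_lr * (min_lr / max_lr) ** (drops_so_far / num_drops)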
sconce.rate_controllers.triangle_rate_controller module¶
class sconce.rate_controllers.triangle_rate_controller.TriangleRateController(min_learning_rate, max_learning_rate)[source]¶
Bases: sconce.rate_controllers.base.RateController

A learning rate that rises linearly from <min_learning_rate> to <max_learning_rate> over the first <num_steps>/2, then drops linearly back to <min_learning_rate> over the remaining <num_steps>/2.
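The schedule, written out for illustration (not sconce internals):

    def triangle_rate(step, num_steps, min_lr, max_lr):
        # Linear ramp up over the first half of the session, linear ramp
        # down over the second half.
        half = num_steps / 2
        if step <= half:
            return min_lr + (max_lr - min_lr) * (step / half)
        return max_lr - (max_lr - min_lr) * ((step - half) / half)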