RLPack
 
Optimizers

RLPack provides a set of optimizers directly from PyTorch to train the models in our agents. Just like everything else in RLPack, optimizers are accessible via keywords. An optimizer can be set with the usual optimizer_name: <keyword> convention; for example, optimizer_name: "adam" initializes the Adam optimizer from PyTorch. Further arguments are passed via the key optimizer_args, which is a dictionary of keyword arguments for the optimizer (excluding model parameters). For the keyword arguments of an optimizer, refer to PyTorch's official documentation corresponding to that optimizer.
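For illustration, a config fragment selecting Adam might look like the sketch below. Only optimizer_name and optimizer_args are described on this page; the surrounding keys (agent, model settings, and so on) are omitted here, and the exact structure depends on your setup. The argument values shown are example PyTorch Adam keyword arguments, not recommended defaults.

```python
# Illustrative config fragment (sketch): only `optimizer_name` and
# `optimizer_args` come from this page; other keys depend on your agent config.
config = {
    # ... other agent/model keys ...
    "optimizer_name": "adam",      # selects the Adam optimizer by keyword
    "optimizer_args": {            # keyword arguments forwarded to the optimizer
        "lr": 1e-3,                # learning rate
        "betas": (0.9, 0.999),     # Adam's exponential decay rates
        "eps": 1e-8,               # numerical stability term
        "weight_decay": 0.0,       # L2 penalty
    },
}
```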

Currently, the following optimizers have been implemented in RLPack, i.e., they can be selected with keywords.

| Optimizer | Description | Keyword |
|-----------|-------------|---------|
| Adam | The Adam optimizer. Mandatory arguments can be looked up here. To further understand the Adam optimization algorithm, you can refer here. | "adam" |
| AdamW | The AdamW optimizer. Mandatory arguments can be looked up here. To further understand the AdamW optimization algorithm, you can refer here. | "adamw" |
| RMSProp | The Root Mean Squared Propagation optimizer. Mandatory arguments can be looked up here. For further details, refer to the lecture notes by G. Hinton. | "rmsprop" |
| SGD | The Stochastic Gradient Descent optimizer. Mandatory arguments can be looked up here. For further understanding of the SGD algorithm, refer here. | "sgd" |
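Selecting an optimizer by keyword is, in effect, equivalent to constructing the corresponding torch.optim class with the model's parameters plus the contents of optimizer_args. The sketch below shows that equivalence for the "adam" keyword; it is an illustration of what the keyword resolves to, not RLPack's actual internals, and the model is a placeholder.

```python
import torch

# Placeholder model for illustration only; RLPack supplies the real model's parameters.
model = torch.nn.Linear(8, 4)

optimizer_args = {"lr": 1e-3, "betas": (0.9, 0.999), "eps": 1e-8}

# What `optimizer_name: "adam"` together with `optimizer_args` effectively selects:
# the model parameters are passed in by the agent, everything else comes from optimizer_args.
optimizer = torch.optim.Adam(model.parameters(), **optimizer_args)
```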