d3rlpy.models.optimizers.AdamFactory

class d3rlpy.models.optimizers.AdamFactory(betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False, **kwargs)

An alias for the Adam optimizer.
from d3rlpy.models.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)
Parameters

betas – coefficients used for computing running averages of the gradient and its square.
eps – term added to the denominator to improve numerical stability.
weight_decay – weight decay (L2 penalty).
amsgrad – flag to use the AMSGrad variant of this algorithm.
Methods

create(params, lr)

Returns an optimizer object.
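The factory pattern here exists so that optimizer hyperparameters can be chosen before the model's parameters or learning rate are known. Below is a minimal stdlib-only sketch of that pattern; it is not the real d3rlpy implementation (whose create() returns a torch.optim.Adam bound to the given parameters), and the class name AdamFactorySketch is made up for illustration.

```python
class AdamFactorySketch:
    """Illustrative stand-in for d3rlpy's AdamFactory (not the real code)."""

    def __init__(self, betas=(0.9, 0.999), eps=1e-08,
                 weight_decay=0, amsgrad=False, **kwargs):
        # Store hyperparameters now; the optimizer itself is built
        # later, once model parameters and a learning rate exist.
        self._kwargs = dict(betas=betas, eps=eps,
                            weight_decay=weight_decay,
                            amsgrad=amsgrad, **kwargs)

    def create(self, params, lr):
        # In d3rlpy this step would construct the actual optimizer:
        #   torch.optim.Adam(params, lr=lr, **self._kwargs)
        # Here we just return the fully resolved configuration.
        return {"params": list(params), "lr": lr, **self._kwargs}


factory = AdamFactorySketch(weight_decay=1e-4)
config = factory.create(params=["w", "b"], lr=3e-4)
```

The key design point is the deferred construction: one factory instance can be reused to create identically configured optimizers for several networks (e.g. actor and critic), each with its own parameter list and learning rate.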