d3rlpy.optimizers.AdamFactory¶

class d3rlpy.optimizers.AdamFactory(betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False, **kwargs)[source]¶

An alias for the Adam optimizer.

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)

Parameters:
- betas – coefficients used for computing running averages of the gradient and its square.
- eps – term added to the denominator to improve numerical stability.
- weight_decay – weight decay (L2 penalty).
- amsgrad – flag to use the AMSGrad variant of Adam.

Methods
create(params, lr)¶

Returns an optimizer object configured with the factory's stored settings.

Parameters:
- params – a list of PyTorch module parameters.
- lr – learning rate.

Returns: an optimizer object.

Return type: torch.optim.Optimizer
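Conceptually, the factory captures the Adam hyperparameters at construction time and builds a torch.optim.Adam when create is called. A minimal sketch of this pattern in plain PyTorch (the class name MyAdamFactory is illustrative, not d3rlpy's implementation):

```python
import torch
import torch.nn as nn


class MyAdamFactory:
    # Illustrative stand-in for AdamFactory: it stores the Adam
    # hyperparameters and defers optimizer construction to create().
    def __init__(self, betas=(0.9, 0.999), eps=1e-08, weight_decay=0,
                 amsgrad=False):
        self._kwargs = dict(betas=betas, eps=eps,
                            weight_decay=weight_decay, amsgrad=amsgrad)

    def create(self, params, lr):
        # Returns a torch.optim.Optimizer bound to the given parameters,
        # configured with the stored hyperparameters.
        return torch.optim.Adam(params, lr=lr, **self._kwargs)


model = nn.Linear(4, 2)
factory = MyAdamFactory(weight_decay=1e-4)
optimizer = factory.create(model.parameters(), lr=3e-4)
```

Deferring construction like this lets an algorithm hold one factory object and build separate, identically configured optimizers for each of its networks.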