d3rlpy.optimizers.AdamFactory

class d3rlpy.optimizers.AdamFactory(betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False, **kwargs)[source]

An alias for the Adam optimizer.

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)
Parameters:
  • betas (tuple) – coefficients used for computing running averages of the gradient and its square.
  • eps (float) – term added to the denominator to improve numerical stability.
  • weight_decay (float) – weight decay (L2 penalty).
  • amsgrad (bool) – flag to use the AMSGrad variant of this algorithm.
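
In practice the factory is rarely called directly; it is handed to an algorithm, which creates the optimizers for its networks internally. A minimal sketch, assuming an algorithm configuration class such as DQNConfig that accepts an optim_factory argument (the algorithm and argument names are illustrative, so verify them against the algorithm you use):

from d3rlpy.algos import DQNConfig
from d3rlpy.optimizers import AdamFactory

# assumed: DQNConfig exposes an optim_factory argument for its Q-function optimizer
dqn = DQNConfig(optim_factory=AdamFactory(weight_decay=1e-4)).create()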
optim_cls

The torch.optim.Adam class.

Type: type

optim_kwargs

Given parameters for an optimizer.

Type: dict

Methods

create(params, lr)

Returns an optimizer object.

Parameters:
  • params (list) – a list of PyTorch parameters.
  • lr (float) – learning rate.
Returns: an optimizer object.
Return type: torch.optim.Optimizer
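
As a concrete illustration of create(), the following sketch builds an Adam optimizer for the parameters of an ordinary PyTorch module, following the signature documented above (the Linear module is only a stand-in for any set of parameters):

import torch

from d3rlpy.optimizers import AdamFactory

model = torch.nn.Linear(4, 2)  # stand-in module; any PyTorch parameters work
factory = AdamFactory(weight_decay=1e-4)

# create() instantiates torch.optim.Adam with the stored keyword arguments
optimizer = factory.create(list(model.parameters()), lr=3e-4)
assert isinstance(optimizer, torch.optim.Adam)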

get_params(deep=False)

Returns optimizer parameters.

Parameters: deep (bool) – flag to deeply copy the parameters.
Returns: optimizer parameters.
Return type: dict
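
The returned dictionary is assumed to mirror the constructor arguments, so it can be used to re-create an equivalent factory; a brief sketch (the exact keys are an assumption based on the constructor signature):

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)

params = factory.get_params(deep=True)
# expected to contain the constructor arguments, e.g.
# {"betas": (0.9, 0.999), "eps": 1e-08, "weight_decay": 0.0001, "amsgrad": False}
new_factory = AdamFactory(**params)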