d3rlpy.models.AdamFactory

class d3rlpy.models.AdamFactory(betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False)[source]

An alias for the Adam optimizer.

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)

Parameters:
  • betas (Tuple[float, float]) – coefficients used for computing running averages of the gradient and its square.

  • eps (float) – term added to the denominator to improve numerical stability.

  • weight_decay (float) – weight decay (L2 penalty).

  • amsgrad (bool) – flag to use the AMSGrad variant of this algorithm.
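
The factory is usually not called by hand; it is handed to an algorithm, which then builds the optimizers for its internal networks with these settings. A minimal sketch, assuming a DQNConfig-style constructor that accepts an optim_factory argument (the exact constructor name may differ between d3rlpy versions):

from d3rlpy.algos import DQNConfig
from d3rlpy.optimizers import AdamFactory

# DQNConfig and its optim_factory argument are assumptions; check your d3rlpy version
factory = AdamFactory(betas=(0.9, 0.999), eps=1e-8, weight_decay=1e-4, amsgrad=False)
config = DQNConfig(optim_factory=factory)
dqn = config.create()  # the algorithm builds its optimizers via the factory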

Methods

create(named_modules, lr)[source]

Returns an optimizer object.

Parameters:
  • named_modules (list) – List of tuples of module names and modules.

  • lr (float) – Learning rate.

Returns:

an optimizer object.

Return type:

torch.optim.Optimizer
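
create can also be called directly, following the signature documented above, for example to inspect the optimizer the factory would build for a network. A minimal sketch, assuming the returned object is a regular torch.optim.Adam instance:

import torch.nn as nn

from d3rlpy.optimizers import AdamFactory

model = nn.Linear(4, 2)
factory = AdamFactory(weight_decay=1e-4)

# named_modules is a list of (name, module) tuples, as documented above
optim = factory.create(list(model.named_modules()), lr=3e-4)
print(type(optim))  # expected: a torch.optim.Optimizer subclass such as Adam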

classmethod deserialize(serialized_config)
Parameters:

serialized_config (str) –

Return type:

TConfig

classmethod deserialize_from_dict(dict_config)
Parameters:

dict_config (Dict[str, Any]) –

Return type:

TConfig

classmethod deserialize_from_file(path)
Parameters:

path (str) –

Return type:

TConfig

classmethod from_dict(kvs, *, infer_missing=False)
Parameters:

kvs (Optional[Union[dict, list, str, int, float, bool]]) –

Return type:

A

classmethod from_json(s, *, parse_float=None, parse_int=None, parse_constant=None, infer_missing=False, **kw)
Parameters:

s (Union[str, bytes, bytearray]) –

Return type:

A

static get_type()[source]
Return type:

str
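
get_type returns the short name under which this factory is registered, which is how the matching factory class is identified when a serialized configuration is loaded. A one-line check (the returned string is an assumption; "adam" is the expected value for this factory):

from d3rlpy.optimizers import AdamFactory

print(AdamFactory.get_type())  # expected: "adam" (assumed registered name)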

classmethod schema(*, infer_missing=False, only=None, exclude=(), many=False, context=None, load_only=(), dump_only=(), partial=False, unknown=None)
Parameters:
  • infer_missing (bool) –

  • many (bool) –

  • partial (bool) –

Return type:

SchemaF[A]

serialize()
Return type:

str
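
serialize and deserialize give a string round trip for the factory configuration, which is convenient for storing optimizer settings alongside an experiment. A minimal sketch, assuming the two methods are inverses:

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(weight_decay=1e-4)

# serialize to a string and rebuild an equivalent factory from it
config_str = factory.serialize()
restored = AdamFactory.deserialize(config_str)
assert restored.weight_decay == factory.weight_decay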

serialize_to_dict()
Return type:

Dict[str, Any]
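
serialize_to_dict and deserialize_from_dict provide the same round trip with a plain dictionary, which is useful when the factory configuration is embedded in a larger config. A minimal sketch under the same inverse assumption:

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(eps=1e-6)

# dict round trip instead of a string round trip
config_dict = factory.serialize_to_dict()
restored = AdamFactory.deserialize_from_dict(config_dict)
assert restored.eps == factory.eps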

to_dict(encode_json=False)
Return type:

Dict[str, Optional[Union[dict, list, str, int, float, bool]]]

to_json(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, indent=None, separators=None, default=None, sort_keys=False, **kw)
Return type:

str
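
to_json and from_json appear to come from the generic dataclass-JSON mixin and give a plain JSON round trip, independent of the d3rlpy-specific serialize helpers. A minimal sketch under that assumption:

from d3rlpy.optimizers import AdamFactory

factory = AdamFactory(amsgrad=True)

# JSON round trip via the generic dataclass-JSON methods
json_str = factory.to_json()
restored = AdamFactory.from_json(json_str)
assert restored.amsgrad is True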

Attributes

amsgrad: bool = False
betas: Tuple[float, float] = (0.9, 0.999)
eps: float = 1e-08
weight_decay: float = 0