syna.optim.adam module

class syna.optim.adam.Adam(alpha: float = 0.001, beta1: float = 0.9, beta2: float = 0.999, eps: float = 1e-08)[source]

Bases: Optimizer

Adam optimizer with bias correction handled via a dynamic lr property.

alpha: base step size
beta1, beta2: exponential decay rates for the first and second moment estimates
eps: small constant added to the denominator for numerical stability
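For reference, with gradient g_t at step t these hyperparameters enter the standard Adam update (Kingma & Ba, 2015), written here with the bias correction folded into a single step-size factor, matching the lr property documented below:

    m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t
    v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2
    \theta_t = \theta_{t-1} - \alpha\,\frac{\sqrt{1 - \beta_2^t}}{1 - \beta_1^t}\,\frac{m_t}{\sqrt{v_t} + \epsilon}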

property lr: float

Compute the bias-corrected learning-rate factor.
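A minimal sketch of what this property likely computes, assuming the optimizer tracks its step count in an attribute self.t (the attribute name is an assumption, not taken from the source):

    import math

    @property
    def lr(self) -> float:
        # Fold Adam's two bias-correction terms into one step-size factor:
        # alpha * sqrt(1 - beta2^t) / (1 - beta1^t).
        fix1 = 1.0 - math.pow(self.beta1, self.t)  # self.t: assumed step counter
        fix2 = 1.0 - math.pow(self.beta2, self.t)
        return self.alpha * math.sqrt(fix2) / fix1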

update(*args, **kwargs) None[source]

Increment the time step and perform parameter updates.
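A hypothetical usage pattern, assuming the Optimizer base class follows the common setup/update convention; the setup, cleargrads, and loader names below are assumptions for illustration:

    from syna.optim.adam import Adam

    optimizer = Adam(alpha=1e-3)
    optimizer.setup(model)        # hypothetical: attach the optimizer to a model

    for batch in loader:          # hypothetical training loop
        loss = model(batch)       # forward pass producing a scalar loss
        model.cleargrads()        # hypothetical: zero out stale gradients
        loss.backward()           # populate gradients for every parameter
        optimizer.update()        # increment t, then call update_one per parameter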

update_one(param) None[source]

Apply the Adam update to a single parameter.
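A minimal sketch of a per-parameter Adam step consistent with the lr property above; the moment-buffer bookkeeping (self.ms, self.vs) and the param.data / param.grad attribute layout are assumptions, not taken from the source:

    import numpy as np

    def update_one(self, param) -> None:
        # Lazily allocate first/second moment buffers per parameter
        # (self.ms / self.vs dicts are assumed names, not from the source).
        key = id(param)
        if key not in self.ms:
            self.ms[key] = np.zeros_like(param.data)
            self.vs[key] = np.zeros_like(param.data)

        m, v = self.ms[key], self.vs[key]
        grad = param.grad.data  # assumed attribute layout

        # In-place exponential moving averages of grad and grad**2.
        m += (1 - self.beta1) * (grad - m)
        v += (1 - self.beta2) * (grad * grad - v)

        # self.lr already includes the bias-correction factors.
        param.data -= self.lr * m / (np.sqrt(v) + self.eps)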