syna.optim.adadelta module

class syna.optim.adadelta.AdaDelta(rho: float = 0.95, eps: float = 1e-06)[source]

Bases: Optimizer

AdaDelta optimizer. Adapts a per-parameter step size from running averages of squared gradients and squared updates, so no global learning rate is required.
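The update rule behind this (Zeiler's AdaDelta) can be sketched for a single scalar parameter as below. This is a hypothetical standalone illustration of the algorithm, not the library's actual implementation; the function name `adadelta_step` and the `state` dictionary are assumptions for the sketch.

```python
import math

def adadelta_step(param, grad, state, rho=0.95, eps=1e-6):
    """One AdaDelta step for a scalar parameter (illustrative sketch only)."""
    # Running average of squared gradients: E[g^2] <- rho*E[g^2] + (1-rho)*g^2
    state["sq_grad"] = rho * state["sq_grad"] + (1 - rho) * grad ** 2
    # Step size is the ratio of the RMS of past updates to the RMS of
    # gradients, so no global learning rate appears anywhere.
    delta = -math.sqrt(state["sq_delta"] + eps) / math.sqrt(state["sq_grad"] + eps) * grad
    # Running average of squared updates: E[dx^2] <- rho*E[dx^2] + (1-rho)*dx^2
    state["sq_delta"] = rho * state["sq_delta"] + (1 - rho) * delta ** 2
    return param + delta

# Example: one step from param=1.0 with gradient 2.0 moves the
# parameter slightly opposite the gradient.
state = {"sq_grad": 0.0, "sq_delta": 0.0}
p = adadelta_step(1.0, 2.0, state)
```

Because the eps term seeds the numerator, the very first step is small; subsequent steps grow as the running average of updates accumulates.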

update_one(param) None[source]

Update a single parameter in place using the AdaDelta rule. Overrides the abstract method declared on Optimizer.