syna.optim.adagrad module

class syna.optim.adagrad.AdaGrad(lr: float = 0.001, eps: float = 1e-08)[source]

    Bases: Optimizer

    AdaGrad optimizer with a per-parameter adaptive learning rate: each
    parameter's effective step size is scaled by the inverse square root of
    the accumulated sum of its squared past gradients, so frequently
    updated parameters take smaller steps over time.

    update_one(param) → None[source]

        Apply the AdaGrad update to a single parameter. Implements the
        abstract hook declared on Optimizer.
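The update rule above can be sketched in plain NumPy. This is a minimal illustration, not the syna implementation: the `Optimizer` base class is omitted, and `param` is assumed to expose `.data` and `.grad` arrays, mirroring the `update_one(param)` signature shown in the docs.

```python
import numpy as np

class AdaGrad:
    """Minimal AdaGrad sketch (assumed interface, not syna's actual code).

    Keeps a running sum of squared gradients per parameter and scales
    each step by 1 / (sqrt(sum) + eps).
    """

    def __init__(self, lr: float = 0.001, eps: float = 1e-08):
        self.lr = lr
        self.eps = eps
        self._h = {}  # id(param) -> accumulated squared gradients

    def update_one(self, param) -> None:
        # `param` is assumed to have `.data` and `.grad` ndarrays.
        h = self._h.setdefault(id(param), np.zeros_like(param.data))
        h += param.grad ** 2                       # accumulate g^2 in place
        param.data -= self.lr * param.grad / (np.sqrt(h) + self.eps)
```

Because the accumulator `h` only grows, the effective learning rate decays monotonically for each parameter; this is the defining property (and main limitation) of AdaGrad compared to later variants such as RMSProp, which use a decaying average instead.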