syna.optim.optimizer module
Optimization algorithms and helper utilities.
Includes common optimizers (SGD, Adam, AdaGrad, etc.) and small utilities used as optimizer hooks (weight decay, gradient clipping, parameter freezing).
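A typical training step might look like the sketch below. This is a hypothetical usage example: the SGD constructor signature, the add_hook method name, and the model/loss objects are assumptions for illustration, not confirmed API.

```python
# Hypothetical usage sketch; names below marked "assumed" are not confirmed API.
from syna.optim.optimizer import ClipGrad

optimizer = SGD(lr=0.01)                    # SGD assumed to subclass Optimizer
optimizer.setup(model)                      # attach the optimizer to the model
optimizer.add_hook(ClipGrad(max_norm=1.0))  # assumed hook registration method

loss = model(x)                             # forward pass on user-provided data
loss.backward()                             # populate parameter gradients
optimizer.update()                          # hooks run first, then parameters update
```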
- class syna.optim.optimizer.ClipGrad(max_norm: float)
Bases: object
Clip gradients by global norm: if the norm of all gradients concatenated exceeds max_norm, every gradient is rescaled so that the global norm equals max_norm.
max_norm: maximum allowed norm of the concatenated gradients.
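A minimal sketch of what such a hook computes, assuming the hook is called with an iterable of parameters and that each parameter exposes a NumPy-like grad array (both assumptions about the surrounding API):

```python
import math

class ClipGrad:
    """Sketch of clipping by global norm; the params argument and the
    param.grad attribute are assumptions about the surrounding API."""

    eps = 1e-6  # guard against division by zero (assumed constant)

    def __init__(self, max_norm: float):
        self.max_norm = max_norm

    def __call__(self, params):
        # Global norm of all gradients treated as one concatenated vector.
        total = math.sqrt(sum(float((p.grad ** 2).sum())
                              for p in params if p.grad is not None))
        rate = self.max_norm / (total + self.eps)
        if rate < 1.0:  # only scale down, never up
            for p in params:
                if p.grad is not None:
                    p.grad *= rate
```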
- class syna.optim.optimizer.FreezeParam(*layers)
Bases: object
Freeze the specified parameters or layers by setting their gradients to None before each update.
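A minimal sketch of such a hook, assuming layers expose their parameters via a params() accessor and that the optimizer skips parameters whose grad is None (both assumptions, not confirmed API):

```python
class FreezeParam:
    """Sketch: collects parameters from the given layers/params and clears
    their gradients before each update."""

    def __init__(self, *layers):
        self.freeze_params = []
        for layer in layers:
            if hasattr(layer, "params"):  # a layer: freeze all of its parameters
                self.freeze_params.extend(layer.params())  # params() is assumed
            else:                         # a single parameter object
                self.freeze_params.append(layer)

    def __call__(self, params):
        # Clearing grad is assumed to make the optimizer skip the parameter.
        for p in self.freeze_params:
            p.grad = None
```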
- class syna.optim.optimizer.Optimizer
Bases: object
Base class for all optimizers.
Subclasses should implement update_one(param), which updates a single parameter in place. The optimizer keeps a reference to its target (typically a model) via setup(target) and supports hooks that run before each update.
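To add a new optimizer, subclass Optimizer and implement update_one. A minimal plain-SGD sketch, assuming parameters carry NumPy-like data and grad arrays and that the base class calls update_one for each parameter with a gradient (assumptions about the surrounding API):

```python
from syna.optim.optimizer import Optimizer

class SGD(Optimizer):
    """Plain stochastic gradient descent: param.data -= lr * param.grad."""

    def __init__(self, lr: float = 0.01):
        super().__init__()  # assumes the base class defines __init__
        self.lr = lr

    def update_one(self, param):
        # Called by the base class for every parameter being updated.
        param.data -= self.lr * param.grad
```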