syna.layers.layer module

Neural network layer definitions.

Defines the Layer base class and common layer implementations such as Linear, RNN and LSTM. Layers manage parameters and support saving/loading weights.

class syna.layers.layer.Layer[source]

Bases: object

Base layer class that tracks Parameters and sub-Layers.

Subclasses must implement forward(). Any attribute assigned to a layer that is a Parameter or another Layer is registered automatically.
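A minimal sketch of a custom layer built on this class, assuming the package layout shown on this page; the layer sizes and the absence of an activation are illustrative choices, not part of the documented API:

    from syna.layers.layer import Layer, Linear

    class TwoLayerNet(Layer):
        # Hypothetical example: attributes that are Layers (or Parameters)
        # are registered automatically, so params() and save_weights()
        # see l1 and l2 without any extra bookkeeping.
        def __init__(self, hidden_size, out_size):
            super().__init__()
            self.l1 = Linear(hidden_size)
            self.l2 = Linear(out_size)

        def forward(self, x):
            # Subclasses must implement forward(); here it simply chains
            # the two sub-layers (no activation, to keep the sketch small).
            h = self.l1.forward(x)
            return self.l2.forward(h)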

cleargrads()[source]

Clear gradients of all parameters.
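A sketch of where cleargrads() typically fits in a training step; the loss and backward calls are shown only as comments because the autograd API is not documented on this page and is therefore an assumption:

    import numpy as np

    net = TwoLayerNet(hidden_size=16, out_size=1)   # from the sketch above
    x = np.random.randn(8, 4).astype(np.float32)

    net.cleargrads()                     # drop gradients left over from a previous step
    y = net.forward(x)
    # loss = loss_fn(y, t); loss.backward()   # hypothetical autograd calls,
    #                                         # not defined in this module
    # a gradient-based update over net.params() would follow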

forward(inputs)[source]

Compute the forward pass. Must be implemented by subclasses.

load_weights(path: str)[source]

Load parameters from a .npz file created by save_weights().

params()[source]

Yield all Parameter objects in this layer, including those in nested sub-layers.
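For example, params() can be used to walk every Parameter in a nested model; the .data attribute is an assumption about the Parameter class, which is documented elsewhere:

    total = 0
    for p in net.params():               # walks net itself plus the nested l1 and l2
        total += p.data.size             # `.data` as an ndarray is an assumption
    print("number of parameter values:", total)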

save_weights(path: str)[source]

Save layer parameters to a compressed .npz file.
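A round-trip sketch for save_weights() and load_weights(); the file name is arbitrary, and passing in_size explicitly when reloading is an assumption made so that W exists before the stored arrays are copied in:

    from syna.layers.layer import Linear

    layer = Linear(3, in_size=5)                 # in_size given, so W exists up front
    layer.save_weights("linear.npz")             # writes a compressed .npz file

    restored = Linear(3, in_size=5)              # same shapes so the stored arrays match
    restored.load_weights("linear.npz")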

class syna.layers.layer.Linear(out_size: int, nobias: bool = False, dtype=<class 'numpy.float32'>, in_size: int | None = None)[source]

Bases: Layer

Fully-connected linear layer.

Args:

out_size: output dimension.
nobias: if True, no bias is used.
dtype: NumPy dtype for the parameters.
in_size: optional input dimension; if not provided, W is initialized lazily on the first forward pass based on the input shape.

forward(inputs)[source]

Apply the linear transformation to the inputs, initializing W lazily on the first call if in_size was not given.
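A short usage sketch of the lazy initialization; the (in, out) orientation of W and the acceptance of a plain NumPy array as input are assumptions:

    import numpy as np
    from syna.layers.layer import Linear

    layer = Linear(3)                            # in_size omitted: W not created yet
    x = np.random.randn(4, 5).astype(np.float32)
    y = layer.forward(x)                         # W is initialized from the input's
                                                 # feature size here (assumed 5 x 3)
    print(y.shape)                               # (4, 3) if the output is array-like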