syna.layers.rnn module

class syna.layers.rnn.LSTM(hidden_size: int, in_size: int | None = None)[source]

Bases: Layer

Long short-term memory (LSTM) layer.

Uses separate Linear layers for the input-to-gate and hidden-to-gate transforms; the hidden-to-gate transforms carry no bias.
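
For orientation, below is a minimal NumPy sketch of one such step. The gate order (f, i, o, g), the fused weight layout, and all shapes are illustrative assumptions; only the split between a biased input-to-gate transform and an unbiased hidden-to-gate transform follows the description above.

import numpy as np

def lstm_step(x, h_prev, c_prev, Wx, Wh, b):
    # The input-to-gate transform carries the bias b; the
    # hidden-to-gate transform (Wh) has none, as documented.
    z = x @ Wx + b + h_prev @ Wh          # (batch, 4 * hidden)
    f, i, o, g = np.split(z, 4, axis=1)   # gate order is an assumption
    f = 1.0 / (1.0 + np.exp(-f))          # forget gate
    i = 1.0 / (1.0 + np.exp(-i))          # input gate
    o = 1.0 / (1.0 + np.exp(-o))          # output gate
    g = np.tanh(g)                        # candidate cell state
    c_new = f * c_prev + i * g            # new cell state
    h_new = o * np.tanh(c_new)            # new hidden state
    return h_new, c_new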

forward(x)[source]

Compute one step of the LSTM.

Returns the new hidden state h_t; the cell state c_t is updated internally.

reset_state()[source]

Reset the hidden and cell states.
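
A hedged usage sketch, assuming forward() accepts a (batch, features) array per timestep and that in_size is inferred from the first input; the array shapes are hypothetical:

import numpy as np
from syna.layers.rnn import LSTM

lstm = LSTM(hidden_size=32)             # in_size inferred on first call
lstm.reset_state()                      # clean state before a new sequence
for x_t in np.random.randn(10, 4, 8):   # 10 timesteps, batch 4, 8 features
    h_t = lstm.forward(x_t)             # assumed shape: (4, 32)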

class syna.layers.rnn.RNN(hidden_size: int, in_size: int | None = None)[source]

Bases: Layer

Simple recurrent layer with tanh activation.

The recurrence is:

h_t = tanh(x2h(x_t) + h2h(h_{t-1})), where h2h has no bias.
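
In plain NumPy, one step of this recurrence looks like the following sketch (the weight names and shapes are assumptions; only the biased x2h / unbiased h2h split comes from the formula above):

import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    # x2h is x @ Wx + b; h2h is h_prev @ Wh with no bias term.
    return np.tanh(x @ Wx + b + h_prev @ Wh)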

forward(x)[source]

Process one timestep (optionally batched) and return the new hidden state.

reset_state()[source]

Reset the hidden state between sequences.
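
Usage mirrors the LSTM sketch above; calling reset_state() between independent sequences keeps state from leaking across them (shapes are assumptions):

import numpy as np
from syna.layers.rnn import RNN

rnn = RNN(hidden_size=16)                 # in_size inferred on first call
for seq in np.random.randn(3, 5, 2, 6):   # 3 sequences of 5 timesteps
    rnn.reset_state()                     # fresh hidden state per sequence
    for x_t in seq:                       # x_t: (batch 2, features 6)
        h_t = rnn.forward(x_t)            # assumed shape: (2, 16)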