syna.layers.rnn module
- class syna.layers.rnn.LSTM(hidden_size: int, in_size: int | None = None)[source]
Bases: Layer
LSTM layer implementation.
Uses separate Linear layers for the input-to-gate and hidden-to-gate transforms; the hidden-to-gate layers carry no bias.
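A minimal sketch of the per-step computation this implies, assuming the conventional LSTM gate equations; the function name lstm_step, the parameter names (Wx_f, Wh_f, ...), and the NumPy formulation are illustrative only and not the layer's actual API:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h_prev, c_prev, p):
        # Each gate combines an input-to-gate transform (weight + bias)
        # with a hidden-to-gate transform (weight only, no bias).
        f = sigmoid(x @ p["Wx_f"] + p["b_f"] + h_prev @ p["Wh_f"])   # forget gate
        i = sigmoid(x @ p["Wx_i"] + p["b_i"] + h_prev @ p["Wh_i"])   # input gate
        o = sigmoid(x @ p["Wx_o"] + p["b_o"] + h_prev @ p["Wh_o"])   # output gate
        u = np.tanh(x @ p["Wx_u"] + p["b_u"] + h_prev @ p["Wh_u"])   # candidate cell update
        c_new = f * c_prev + i * u      # new cell state
        h_new = o * np.tanh(c_new)      # new hidden state
        return h_new, c_new

With in_size=None, the input-to-gate weights are presumably sized lazily from the first input's feature dimension rather than at construction time.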