syna.functions.activation module

class syna.functions.activation.ReLU[source]

Bases: Function

Rectified Linear Unit: computes max(x, 0) elementwise.

backward(gy)[source]

Backward pass: the upstream gradient gy flows through where the input was positive and is zero elsewhere.

forward(x)[source]

Forward pass: elementwise max(x, 0).
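
A reference sketch of the math this Function implements, in plain NumPy (an illustration, not the actual implementation):

    import numpy as np

    def relu_forward(x):
        # Elementwise max(x, 0).
        return np.maximum(x, 0.0)

    def relu_backward(x, gy):
        # The upstream gradient flows only where the input was positive.
        return gy * (x > 0)
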
class syna.functions.activation.Sigmoid[source]

Bases: Function

Numerically stable sigmoid computed via the tanh identity sigmoid(x) = 0.5 * tanh(0.5 * x) + 0.5, which avoids overflowing exp for large |x|.

backward(gy)[source]

Backward pass: gx = gy * y * (1 - y), where y is the forward output.

forward(x)[source]

Forward pass: numerically stable sigmoid of x.
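
A reference sketch of the tanh-based formulation in plain NumPy (an illustration of the math, not the actual implementation):

    import numpy as np

    def sigmoid_forward(x):
        # sigmoid(x) = 0.5 * tanh(0.5 * x) + 0.5; tanh saturates instead of
        # overflowing, so this is stable for large |x|.
        return 0.5 * np.tanh(0.5 * x) + 0.5

    def sigmoid_backward(y, gy):
        # d sigmoid / dx = y * (1 - y), written in terms of the forward output y.
        return gy * y * (1.0 - y)
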
class syna.functions.activation.Softmax(axis=1)[source]

Bases: Function

Softmax with a numerically stable forward pass and a correct backward pass along the given axis.

backward(gy)[source]

Backward pass: the softmax Jacobian-vector product gx = y * gy - y * sum(y * gy) along axis.

forward(x)[source]

Forward pass: stable softmax of x along axis.
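
A reference sketch in plain NumPy; max-subtraction is one standard way to obtain the stability the docstring describes, and may differ from the actual implementation:

    import numpy as np

    def softmax_forward(x, axis=1):
        # Subtracting the per-axis max leaves the result unchanged
        # mathematically but keeps np.exp from overflowing.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def softmax_backward(y, gy, axis=1):
        # Jacobian-vector product: gx_i = y_i * (gy_i - sum_j y_j * gy_j).
        gx = y * gy
        return gx - y * gx.sum(axis=axis, keepdims=True)
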
syna.functions.activation.relu(x) → Tensor[source]

ReLU activation.
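
Minimal usage sketch (this assumes plain NumPy arrays are accepted as input; the framework may instead require wrapping them in a Tensor first):

    >>> import numpy as np
    >>> from syna.functions.activation import relu
    >>> y = relu(np.array([-1.0, 0.0, 2.0]))  # negative entries map to 0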

syna.functions.activation.sigmoid(x) → Tensor[source]

Sigmoid activation.
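
Usage sketch, under the same ndarray-input assumption as for relu above:

    >>> import numpy as np
    >>> from syna.functions.activation import sigmoid
    >>> y = sigmoid(np.array([0.0]))  # sigmoid(0) = 0.5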

syna.functions.activation.sigmoid_simple(x) → Tensor[source]

Sigmoid implemented directly via exp; returns the result as a Tensor.
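
The direct definition this corresponds to is sigmoid(x) = 1 / (1 + exp(-x)); a plain NumPy sketch of that math (not the actual implementation):

    import numpy as np

    def sigmoid_simple_math(x):
        # np.exp(-x) can overflow for large negative x, which is why the
        # tanh-based Sigmoid above is the numerically safer choice.
        return 1.0 / (1.0 + np.exp(-x))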

syna.functions.activation.softmax(x, axis=1) → Tensor[source]

Softmax along the specified axis.
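
Usage sketch, under the same ndarray-input assumption as above:

    >>> import numpy as np
    >>> from syna.functions.activation import softmax
    >>> x = np.array([[1.0, 2.0, 3.0]])
    >>> y = softmax(x, axis=1)  # each row of y sums to 1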

syna.functions.activation.softmax_simple(x, axis=1)[source]

Softmax using safe exp/normalization helpers.
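
A plain NumPy sketch of the helper-based formulation the docstring suggests; safe_exp here is a hypothetical stand-in for the library's actual helper:

    import numpy as np

    def safe_exp(x, axis=1):
        # Hypothetical helper: exponentiate after subtracting the per-axis
        # max so the result cannot overflow.
        return np.exp(x - x.max(axis=axis, keepdims=True))

    def softmax_simple_math(x, axis=1):
        e = safe_exp(x, axis=axis)
        return e / e.sum(axis=axis, keepdims=True)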