Contains an implementation of batch normalisation.
Contains an implementation of convolutional layers.
Allows one to provide input to a network via a dopt variable.
Contains an implementation of dense (i.e., fully connected) layers.
Contains an implementation of dropout.
Contains an implementation of max pooling.
Contains an implementation of the ReLU activation function.
Contains an implementation of the softmax activation function.
Encapsulates the expressions and parameter information that define a network layer.
Contains generic utilities for working with Layer objects.
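As a generic illustration of how layers like these compose, the sketch below chains a dense layer, the ReLU activation, and the softmax activation into a forward pass. This is plain Python with hypothetical function names, not dopt's D API; it only demonstrates the layer concepts the modules above implement.

```python
import math

def dense(x, W, b):
    # Fully connected layer: output j is dot(W[j], x) + b[j].
    return [sum(wi * xi for wi, xi in zip(row, x)) + bj
            for row, bj in zip(W, b)]

def relu(x):
    # ReLU activation: clamp negative values to zero.
    return [max(0.0, v) for v in x]

def softmax(x):
    # Softmax activation: exponentiate (shifted for numerical
    # stability) and normalise so the outputs sum to one.
    m = max(x)
    exps = [math.exp(v - m) for v in x]
    s = sum(exps)
    return [e / s for e in exps]

# Toy forward pass through dense -> relu -> softmax.
x = [1.0, -2.0]
W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [0.0, 0.0, 0.0]
h = relu(dense(x, W, b))
y = softmax(h)
```

In a real network each of these stages would also carry trainable parameters and gradient expressions, which is what the Layer abstraction above encapsulates.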