dopt.nnet.layers

Contains generic utilities for working with Layer objects.

Modules

batchnorm
module dopt.nnet.layers.batchnorm

Contains an implementation of batch normalisation.

conv
module dopt.nnet.layers.conv

Contains an implementation of convolutional layers.

datasource
module dopt.nnet.layers.datasource

Allows one to provide input to a network via a dopt variable.

dense
module dopt.nnet.layers.dense

Contains an implementation of dense (i.e., fully connected) layers.

dropout
module dopt.nnet.layers.dropout

Contains an implementation of dropout.

maxpool
module dopt.nnet.layers.maxpool

Contains an implementation of max pooling.

relu
module dopt.nnet.layers.relu

Contains an implementation of the ReLU activation function.

softmax
module dopt.nnet.layers.softmax

Contains an implementation of the softmax activation function.
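
The modules above compose by wrapping one Layer in the next. The sketch below shows the general pattern for building a small classifier; it assumes the layer constructor functions (dataSource, dense, relu, softmax) and the float32 variable factory behave as their module names suggest, and the exact signatures (including any optional hyperparameter arguments) may differ.

---
import dopt.core;
import dopt.nnet;

void buildExample()
{
    //A variable holding a minibatch of 100 flattened 28x28 images
    //(assumes the float32 variable factory from dopt.core)
    auto features = float32([100, 28 * 28]);

    //Chain the layer constructors with UFCS: each call takes the previous
    //Layer and returns a new Layer wrapping it
    auto output = dataSource(features)
                 .dense(500)
                 .relu()
                 .dense(10)
                 .softmax();
}
---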

Public Imports

dopt.nnet.layers.batchnorm
public import dopt.nnet.layers.batchnorm;
Batch normalisation.
dopt.nnet.layers.conv
public import dopt.nnet.layers.conv;
Convolutional layers.
dopt.nnet.layers.datasource
public import dopt.nnet.layers.datasource;
Feeding input into a network via a dopt variable.
dopt.nnet.layers.dense
public import dopt.nnet.layers.dense;
Dense (fully connected) layers.
dopt.nnet.layers.dropout
public import dopt.nnet.layers.dropout;
Dropout.
dopt.nnet.layers.maxpool
public import dopt.nnet.layers.maxpool;
Max pooling.
dopt.nnet.layers.relu
public import dopt.nnet.layers.relu;
The ReLU activation function.
dopt.nnet.layers.softmax
public import dopt.nnet.layers.softmax;
The softmax activation function.

Members

Classes

Layer
class Layer

Encapsulates the expressions and parameter information that defines a network layer.
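
As a rough mental model only, the illustrative class below shows the kind of information a layer bundles together. The field names are assumptions chosen for exposition, not Layer's actual members.

---
import dopt.core;

//Illustrative only: these fields are hypothetical stand-ins for the
//expression and parameter information that a layer encapsulates
class IllustrativeLayer
{
    IllustrativeLayer[] deps; //layers whose outputs feed this layer
    Operation output;         //expression that computes the layer's output
    Operation[] parameters;   //trainable parameters used by that expression
}
---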

Functions

topologicalSort
Layer[] topologicalSort(Layer[] ops)
Produces an ordering of the given layers in which each layer appears after the layers it depends on. Undocumented in source, so be warned that the author may not have intended to support it.
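
To illustrate the kind of ordering such a function produces, here is a self-contained sketch over a stand-in Node class with a hypothetical deps field. It is not dopt's implementation; Layer dependencies are tracked internally by the library.

---
import std.algorithm : canFind;

class Node
{
    string name;
    Node[] deps; //hypothetical: the nodes this node depends on

    this(string name, Node[] deps = [])
    {
        this.name = name;
        this.deps = deps;
    }
}

//Depth-first topological sort; assumes the dependency graph is acyclic
Node[] sortNodes(Node[] nodes)
{
    Node[] sorted;

    void visit(Node n)
    {
        if (sorted.canFind(n))
            return;

        //Place dependencies before the node itself
        foreach (d; n.deps)
            visit(d);

        sorted ~= n;
    }

    foreach (n; nodes)
        visit(n);

    return sorted;
}

unittest
{
    auto input  = new Node("input");
    auto hidden = new Node("hidden", [input]);
    auto output = new Node("output", [hidden]);

    //Regardless of the order given, dependencies come before dependants
    assert(sortNodes([output, input, hidden]) == [input, hidden, output]);
}
---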

Meta

Authors

Henry Gouk