Contains generic utilities for working with Layer objects.
Contains an implementation of the regularisation techniques presented in Gouk et al. (2018).
Contains some utilities for constructing graphs for common loss functions.
Provides useful tools for constructing neural networks.
This module contains methods for initialising the parameters of neural networks.
This package contains a deep learning API backed by dopt.
Working examples showing how this package can be used are given in the examples/mnist.d and examples/cifar10.d files.
One would generally start by using UFCS to define a feed-forward network:
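For instance, a small multilayer perceptron could be written as follows. This is a sketch rather than a verbatim example: the helper names (dataSource, dense, relu, softmax) and the shape of the input are assumptions based on the package's layer utilities.

```d
import dopt.core;
import dopt.nnet;

// Symbolic placeholder for a minibatch of flattened 28x28 images
// (batch size and shape assumed; adjust for your data).
auto features = float32([128, 784]);

// UFCS chain defining a feed-forward network. Each call wraps the
// previous Layer, so the whole network reads top to bottom.
auto layers = dataSource(features)
             .dense(2_000)
             .relu()
             .dense(10)
             .softmax();
```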
The DAGNetwork class can then be used to traverse the resulting graph and aggregate parameters/loss terms:
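Continuing the sketch above, constructing the network object walks the graph rooted at the output layers and collects the parameters and any per-layer loss terms. The constructor signature shown here is an assumption:

```d
// Traverse the graph reachable from `layers`, aggregating trainable
// parameters and regularisation terms (e.g., weight decay) on the way.
auto network = new DAGNetwork([features], [layers]);
```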
After this, one can define an objective function; several standard loss functions are implemented in dopt.nnet.losses:
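For a classification task, a cross-entropy data term can be combined with the network's aggregated regularisation terms. The crossEntropy name and the output property are assumptions based on the names in dopt.nnet.losses:

```d
// Symbolic placeholder for the one-hot encoded target labels.
auto labels = float32([128, 10]);

// Total objective: data term plus the sum of any parameter
// regularisation terms collected by the DAGNetwork.
auto lossSym = crossEntropy(layers.output, labels) + network.paramLoss;
```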
where network.paramLoss is the sum of any parameter regularisation terms. The dopt.online package can be used to construct an updater:
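For example, one of the stochastic gradient methods in dopt.online can be used; the function name and argument order here are assumptions:

```d
import dopt.online;

// Construct an updater that minimises lossSym with respect to the
// network's parameters. The result is a callable that performs a
// single optimisation step each time it is invoked.
auto updater = adam([lossSym], network.params, null);
```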
Finally, one can call this updater with some actual training data:
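Each call performs one optimisation step on a minibatch. The feeding mechanism sketched here, an associative array mapping symbolic inputs to host buffers, is an assumption, and featureData/labelData are hypothetical buffers holding one minibatch:

```d
// Evaluate the loss, compute gradients, and apply a parameter update,
// returning the current value(s) of the loss.
auto loss = updater([features: featureData, labels: labelData]);
```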