This package contains a deep learning API backed by dopt.
Working examples of how this package can be used are given in the examples/mnist.d and examples/cifar10.d files.
One would generally start by using UFCS (uniform function call syntax) to define a feed-forward network:
auto features = float32([128, 1, 28, 28]);

auto layers = dataSource(features)
             .dense(2_000)
             .relu()
             .dense(2_000)
             .relu()
             .dense(10)
             .softmax();
The DAGNetwork class can then be used to traverse the resulting graph and aggregate parameters/loss terms:
auto network = new DAGNetwork([features], [layers]);
After this, one can define an objective function---there are a few standard loss functions implemented in dopt.nnet.losses:
auto labels = float32([128, 10]);

auto trainLoss = crossEntropy(layers.trainOutput, labels) + network.paramLoss;
where network.paramLoss is the sum of any parameter regularisation terms. The dopt.online package can be used to construct an updater:
auto updater = sgd([trainLoss], network.params, network.paramProj);
Finally, one can call this updater with some actual training data:
updater([
    features: Buffer(some_real_features),
    labels: Buffer(some_real_labels)
]);
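Putting the steps above together, a minimal end-to-end training loop might look like the following sketch. Only the dopt calls already shown in this document are assumed; the batch-loading helper getBatch, the epoch/batch counts, and the field names on its return value are hypothetical stand-ins for whatever data pipeline you actually use.

```d
import dopt.core, dopt.nnet, dopt.online;

void main()
{
    // Placeholders for a batch of 128 single-channel 28x28 images
    // and their one-hot labels over 10 classes.
    auto features = float32([128, 1, 28, 28]);
    auto labels = float32([128, 10]);

    // Feed-forward network defined with UFCS, as above.
    auto layers = dataSource(features)
                 .dense(2_000)
                 .relu()
                 .dense(2_000)
                 .relu()
                 .dense(10)
                 .softmax();

    // Aggregate the parameters and regularisation terms.
    auto network = new DAGNetwork([features], [layers]);

    // Objective: cross-entropy plus any parameter regularisation.
    auto trainLoss = crossEntropy(layers.trainOutput, labels)
                   + network.paramLoss;

    // Construct an SGD updater from dopt.online.
    auto updater = sgd([trainLoss], network.params, network.paramProj);

    foreach(epoch; 0 .. 10)
    {
        foreach(b; 0 .. numBatches) // numBatches: hypothetical
        {
            // getBatch is a hypothetical helper returning the raw
            // float[] data for one minibatch of features and labels.
            auto batch = getBatch(b);

            updater([
                features: Buffer(batch.features),
                labels: Buffer(batch.labels)
            ]);
        }
    }
}
```

The same pattern, with a convolutional network in place of the dense layers, is what the examples/mnist.d and examples/cifar10.d programs flesh out.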
Contains generic utilities for working with Layer objects.
Contains some utilities for constructing graphs for common loss functions.
Provides useful tools for constructing neural networks.
This module contains methods for initialising the parameters of neural networks.