dopt.online

This package contains implementations of common online optimisation algorithms, with a particular focus on those commonly used in large-scale machine learning and deep learning.

Modules

adam
module dopt.online.adam

Contains an implementation of ADAM that relies on automatic differentiation.

amsgrad
module dopt.online.amsgrad

Contains an implementation of AMSGrad that relies on automatic differentiation.

sgd
module dopt.online.sgd

Contains an implementation of stochastic gradient descent that relies on automatic differentiation. A usage sketch follows below.
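
As a quick illustration of how these modules fit together, here is a minimal sketch of minimising a simple objective with sgd. The exact signature of sgd used here (objectives, parameters to optimise, projections) is an assumption, as are the float32 and sum helpers from dopt.core; consult the module documentation for the real API.

---
import dopt.core;
import dopt.online;

void main()
{
    // A parameter to optimise, and a simple scalar objective f(w) = sum(w^2).
    // float32 and sum are assumed to be provided by dopt.core.
    auto w = float32([10]);
    auto loss = sum(w * w);

    // Construct the update rule. The argument order shown here
    // (objectives, parameters, projections) is an assumption, and the
    // first output is assumed to be the objective being minimised.
    auto update = sgd([loss], [w], null);

    // Each call performs one update step. With no input variables to
    // feed, an empty argument map is passed.
    foreach(i; 0 .. 100)
    {
        update(null);
    }
}
---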

Members

Aliases

Projection
alias Projection = Operation delegate(Operation)

Used for performing projected gradient descent; maps a parameter back onto the feasible set after each update.
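
A minimal sketch of the shape of such a delegate; the identity projection shown here is only a placeholder for a real constraint:

---
import dopt.core;
import dopt.online;

// The simplest possible projection: leave the parameter unchanged.
// A real projection would instead map param back onto the feasible
// set, e.g. by clipping or rescaling it after each update.
Projection identity = (Operation param) => param;
---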

Updater
alias Updater = Buffer[] delegate(Buffer[Operation] args)

A delegate that can be used to perform the update step for an online optimisation algorithm.
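
Each call to an Updater performs one optimisation step: the argument map feeds a Buffer of data to each input Operation, and the returned Buffers hold the values of the requested outputs. A sketch of the calling pattern, where update, features, labels, and the two data Buffers are hypothetical names assumed to come from the surrounding training code:

---
// One training iteration: feed this minibatch and take a single
// optimisation step. featureData and labelData are Buffers wrapping
// the host data for this minibatch.
Buffer[] results = update([features: featureData, labels: labelData]);

// results[0] is assumed to hold the value of the first requested
// output (e.g. the training loss).
---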

Meta

Authors

Henry Gouk