Contains an implementation of ADAM that relies on automatic differentiation.
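As a rough illustration of the idea (the package's host language and API are not shown here), the sketch below implements the standard Adam update in JAX, with jax.grad standing in for the automatic differentiation that supplies the gradients. All names (adam_step, loss) are hypothetical, not this package's identifiers.

```python
# Minimal Adam sketch in JAX; illustrative only, not this package's API.
import jax
import jax.numpy as jnp

def adam_step(params, m, v, t, grads, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Update biased first- and second-moment estimates of the gradient.
    m = b1 * m + (1 - b1) * grads
    v = b2 * v + (1 - b2) * grads**2
    # Bias-correct the moments, then take the scaled step.
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    params = params - lr * m_hat / (jnp.sqrt(v_hat) + eps)
    return params, m, v

loss = lambda w: jnp.sum((w - 3.0) ** 2)   # toy quadratic objective
w = jnp.zeros(4)
m = v = jnp.zeros_like(w)
for t in range(1, 101):
    g = jax.grad(loss)(w)                  # gradient via automatic differentiation
    w, m, v = adam_step(w, m, v, t, g)
```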
Contains an implementation of AMSGrad that relies on automatic differentiation.
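AMSGrad differs from Adam in that it tracks a running maximum of the second-moment estimate and normalises by that instead, which restores convergence guarantees Adam can lose on some online problems. A minimal JAX sketch of the modified step, again with illustrative names:

```python
# AMSGrad step sketch in JAX; illustrative only, not this package's API.
import jax.numpy as jnp

def amsgrad_step(params, m, v, v_max, grads, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grads
    v = b2 * v + (1 - b2) * grads**2
    v_max = jnp.maximum(v_max, v)          # the running max is what distinguishes AMSGrad
    params = params - lr * m / (jnp.sqrt(v_max) + eps)
    return params, m, v, v_max
```

Following the original formulation, this sketch omits Adam's bias correction; the running maximum of v is the only structural change from the Adam step above.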
Contains an implementation of stochastic gradient descent that relies on automatic differentiation.
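For SGD the whole update is a single gradient step, with autodiff replacing a hand-derived gradient. A hedged JAX sketch; loss, sgd_step, and the toy data are all illustrative:

```python
# Plain SGD sketch in JAX; illustrative only, not this package's API.
import jax
import jax.numpy as jnp

loss = lambda w, x, y: jnp.mean((x @ w - y) ** 2)  # toy least-squares loss

def sgd_step(w, x, y, lr=1e-2):
    return w - lr * jax.grad(loss)(w, x, y)        # one gradient step per minibatch

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))
y = jnp.ones(32)
w = jnp.zeros(4)
for _ in range(100):
    w = sgd_step(w, x, y)
```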
Used to perform projected gradient descent.
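Projected gradient descent alternates an ordinary gradient step with a projection back onto the feasible set. A minimal JAX sketch, assuming a box constraint so the projection is just a clip; the names and the constraint are illustrative, not taken from this package:

```python
# Projected gradient descent sketch in JAX; illustrative only.
import jax
import jax.numpy as jnp

loss = lambda w: jnp.sum((w - 2.0) ** 2)   # unconstrained minimum lies outside the box

def pgd_step(w, lr=0.1, lo=-1.0, hi=1.0):
    w = w - lr * jax.grad(loss)(w)         # unconstrained gradient step
    return jnp.clip(w, lo, hi)             # Euclidean projection onto the box [lo, hi]^n

w = jnp.zeros(3)
for _ in range(50):
    w = pgd_step(w)                        # converges to the boundary point 1.0
```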
A delegate that can be used to perform the update step for an online optimisation algorithm.
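One plausible reading of this delegate, sketched in Python since the package's actual type is not shown here: a function type that maps current parameters, gradients, and optimiser state to their updated values, so that different optimisers can be swapped behind a single signature. UpdateStep and sgd_update are hypothetical names.

```python
# Sketch of an update-step delegate as a function type; illustrative only.
from typing import Any, Callable, Tuple

# An update delegate maps (params, grads, state) -> (new params, new state).
UpdateStep = Callable[[Any, Any, Any], Tuple[Any, Any]]

def sgd_update(params, grads, state, lr=1e-2):
    return params - lr * grads, state      # SGD carries no extra state

step: UpdateStep = sgd_update              # any optimiser fitting the shape plugs in
```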
This package contains implementations of common online optimisation algorithms, with a particular bias towards those commonly used in large-scale machine learning and deep learning.