grad

Computes the gradient of a scalar-valued operation with respect to several dependencies.

This function provides an implementation of automatic differentiation that can be used to greatly simplify the process of optimising objective functions. The particular technique used is known as reverse mode automatic differentiation: the derivative of the objective is propagated backwards through the computation graph, from the output towards each dependency.
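To illustrate the idea behind reverse mode automatic differentiation (independently of dopt's API), here is a minimal sketch in Python. The `Var` class and `grad` function below are hypothetical illustrations, not part of dopt; each operation records its parents and the local derivatives, and `grad` propagates gradients backwards from the objective.

```python
# Minimal reverse-mode autodiff sketch (illustrative only; not dopt's API).
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __mul__(self, other):
        # For z = a * b: dz/da = b, dz/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

def grad(objective, wrt):
    """Backpropagate from the scalar `objective` to each Var in `wrt`."""
    objective.grad = 1.0
    # Simple traversal; assumes the graph is a tree, for brevity.
    stack = [objective]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += node.grad * local
            stack.append(parent)
    return [v.grad for v in wrt]

x = Var(3.0)
y = x * x
print(grad(y, [x]))  # d(x*x)/dx = 2x = 6.0
```

A production implementation (as in dopt) operates symbolically on operation graphs and returns new operations rather than numeric gradients, but the backwards accumulation is the same.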

Parameters

objective
Type: Operation

The function being differentiated.

wrt
Type: Operation[]

The (indirect) dependencies that objective is being differentiated with respect to.

Return Value

Type: Operation[]

An array of operations that evaluate to the derivative of objective with respect to each of the elements of wrt.

Examples

import std.random : uniform;
import dopt.core : evaluate;

auto x = float32();
auto y = x * x;
auto gradY = grad(y, [x]);

auto r = uniform(-100.0f, 100.0f);

auto gradYwrtX = gradY.evaluate([
    x: Buffer([r])
])[0];

assert(gradYwrtX.as!float[0] == r + r);

Meta