grad

Computes the gradient of a scalar-valued operation with respect to several dependencies.

This function provides an implementation of automatic differentiation that can greatly simplify the process of optimising objective functions. The particular technique used by this function is known as reverse mode automatic differentiation.
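The idea behind reverse mode automatic differentiation can be sketched independently of dopt: each node in the computation graph records its inputs together with a local gradient rule, and the derivative of the objective is propagated backwards from the output, starting from a seed gradient of 1. The sketch below is illustrative Python, not dopt's actual implementation; the `Var` class and `backward` function are hypothetical names, and the traversal is simplified (a full implementation visits nodes in reverse topological order).

```python
# Minimal reverse-mode AD sketch (illustrative only, not dopt's D API).
# Each Var records its parents together with the local gradient of the
# operation that produced it; backward() propagates the upstream gradient.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (input Var, local gradient) pairs
        self.grad = 0.0

    def __mul__(self, other):
        # product rule locals: d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(objective):
    # seed the output with gradient 1, then push gradients backwards
    objective.grad = 1.0
    stack = [objective]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += node.grad * local
            stack.append(parent)

x = Var(3.0)
y = x * x      # y = x^2
backward(y)
print(x.grad)  # dy/dx = 2x = 6.0
```

dopt performs the analogous propagation symbolically: rather than accumulating numeric gradients immediately, grad constructs new Operation nodes that evaluate to the derivatives when executed.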

Parameters

objective Operation

The function being differentiated.

wrt Operation[]

The (indirect) dependencies that objective is being differentiated with respect to.

Return Value

Type: Operation[]

An array of operations that evaluate to the derivative of objective with respect to each element of wrt.

Examples

import std.random : uniform;
import dopt.core : evaluate, buffer, float32;

// y = x^2, so dy/dx = 2x
auto x = float32();
auto y = x * x;
auto gradY = grad(y, [x]);

// evaluate the derivative at a randomly chosen point, x = r
auto r = uniform(-100.0f, 100.0f);

auto gradYwrtX = gradY.evaluate([
    x: buffer([r])
])[0];

// the gradient at x = r is 2r, i.e. r + r
assert(gradYwrtX.get!float[0] == r + r);
