- convolution
Operation convolution(Operation features, Operation filters, size_t[] padding = [0, 0], size_t[] stride = [1, 1], string mod = __MODULE__, size_t line = __LINE__)
Creates a convolution operation that performs the computation required to implement a convolutional layer.
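The computation itself can be illustrated with a naive NumPy sketch. The layout is an assumption for illustration only (NCHW feature maps, filters of shape (outputChannels, inputChannels, kernelHeight, kernelWidth), symmetric zero padding); it is not this library's D API, and the actual CUDA backend uses optimised kernels rather than loops.

```python
import numpy as np

def conv2d(features, filters, padding=(0, 0), stride=(1, 1)):
    """Naive NCHW convolution (cross-correlation, as is conventional
    in neural network libraries). Illustrative only."""
    n, c, h, w = features.shape
    f, c2, kh, kw = filters.shape
    assert c == c2, "filter channels must match feature channels"
    ph, pw = padding
    sh, sw = stride
    x = np.pad(features, ((0, 0), (0, 0), (ph, ph), (pw, pw)))
    oh = (h + 2 * ph - kh) // sh + 1
    ow = (w + 2 * pw - kw) // sw + 1
    out = np.zeros((n, f, oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[:, :, i * sh:i * sh + kh, j * sw:j * sw + kw]
            # contract over channels and the kernel window for every filter
            out[:, :, i, j] = np.tensordot(patch, filters,
                                           axes=([1, 2, 3], [1, 2, 3]))
    return out
```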
- convolutionFeaturesGrad
Operation convolutionFeaturesGrad(Operation parentGrad, Operation filters, size_t[] featuresShape, size_t[] padding, size_t[] stride, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the derivative of a convolution operation with respect to the feature maps. This is equivalent to a transposed convolution of parentGrad with filters using the same padding and stride.
- convolutionFiltersGrad
Operation convolutionFiltersGrad(Operation parentGrad, Operation features, size_t[] filtersShape, size_t[] padding, size_t[] stride, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the derivative of a convolution operation with respect to the filters.
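Conceptually, the filter gradient correlates the (padded) input features with the incoming gradient: each kernel element accumulates the product of the output gradients with the input values it touched. A hedged NumPy sketch, again assuming NCHW layout and naive loops rather than the library's API:

```python
import numpy as np

def conv_filters_grad(parent_grad, features, filters_shape,
                      padding=(0, 0), stride=(1, 1)):
    """Gradient of a naive NCHW convolution w.r.t. its filters.
    Illustrative only."""
    f, c, kh, kw = filters_shape
    ph, pw = padding
    sh, sw = stride
    x = np.pad(features, ((0, 0), (0, 0), (ph, ph), (pw, pw)))
    n, _, oh, ow = parent_grad.shape
    grad = np.zeros(filters_shape)
    for i in range(oh):
        for j in range(ow):
            patch = x[:, :, i * sh:i * sh + kh, j * sw:j * sw + kw]
            # accumulate the outer product over the batch dimension,
            # yielding a (f, c, kh, kw) contribution per output position
            grad += np.tensordot(parent_grad[:, :, i, j], patch,
                                 axes=([0], [0]))
    return grad
```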
- convolutionTranspose
Operation convolutionTranspose(Operation features, Operation filters, size_t[] padding = [0, 0], size_t[] stride = [1, 1], string mod = __MODULE__, size_t line = __LINE__)
Creates a transposed convolution operation (sometimes incorrectly called deconvolution).
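A transposed convolution is the adjoint of the forward convolution with respect to its input: each value in the feature maps scatters a scaled copy of the kernel into the output. A minimal NumPy sketch under an assumed NCHW layout (not this library's D API):

```python
import numpy as np

def conv_transpose2d(features, filters, padding=(0, 0), stride=(1, 1)):
    """Naive transposed convolution: the adjoint of a naive NCHW
    convolution w.r.t. its input. Illustrative only."""
    n, f, oh, ow = features.shape
    f2, c, kh, kw = filters.shape
    assert f == f2, "feature channels must match the filter count"
    ph, pw = padding
    sh, sw = stride
    h = (oh - 1) * sh + kh - 2 * ph
    w = (ow - 1) * sw + kw - 2 * pw
    out = np.zeros((n, c, h + 2 * ph, w + 2 * pw))
    for i in range(oh):
        for j in range(ow):
            # scatter each input position back through the kernel
            out[:, :, i * sh:i * sh + kh, j * sw:j * sw + kw] += np.tensordot(
                features[:, :, i, j], filters, axes=([1], [0]))
    # crop away the region that forward padding would have added
    return out[:, :, ph:ph + h, pw:pw + w]
```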
- maxpool
Operation maxpool(Operation features, size_t[] dims, string mod = __MODULE__, size_t line = __LINE__)
Creates a max pool operation that performs the computation required to implement a max pooling layer.
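A minimal NumPy sketch of non-overlapping max pooling with window `dims`, assuming NCHW layout and spatial extents divisible by the window (a simplification for illustration, not the library's API):

```python
import numpy as np

def maxpool(features, dims):
    """Non-overlapping max pooling over the spatial dimensions (NCHW).
    Assumes h and w are divisible by the window sizes."""
    n, c, h, w = features.shape
    dh, dw = dims
    # split each spatial axis into (window index, offset within window)
    x = features.reshape(n, c, h // dh, dh, w // dw, dw)
    return x.max(axis=(3, 5))
```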
- maxpoolGrad
Operation maxpoolGrad(Operation parentGrad, Operation op, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the derivative of a maxpool operation with respect to the feature maps.
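The backward pass routes each pooled gradient back to the position that attained the window maximum. A hedged NumPy sketch under the same assumptions as above (NCHW, non-overlapping windows); note that on ties it sends the gradient to every maximal position, which may differ from a real backend's tie-breaking:

```python
import numpy as np

def maxpool_grad(parent_grad, features, dims):
    """Route each pooled gradient to the location(s) of the window
    maximum. Illustrative only."""
    n, c, h, w = features.shape
    dh, dw = dims
    x = features.reshape(n, c, h // dh, dh, w // dw, dw)
    maxed = x.max(axis=(3, 5), keepdims=True)
    mask = (x == maxed)  # 1 where the window maximum was attained
    g = parent_grad.reshape(n, c, h // dh, 1, w // dw, 1)
    return (mask * g).reshape(n, c, h, w)
```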
- relu
Operation relu(Operation inputs, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the computation required for a ReLU layer.
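ReLU is the elementwise map x to max(0, x); a one-line NumPy illustration of the computation:

```python
import numpy as np

def relu(x):
    """Elementwise rectified linear unit: max(0, x)."""
    return np.maximum(x, 0)
```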
- softmax
Operation softmax(Operation inputs, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the computation required for a softmax layer.
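Softmax exponentiates and normalises along the class axis; the standard numerically stable formulation subtracts the per-row maximum before exponentiating, which leaves the result unchanged. An illustrative NumPy sketch (the class axis is an assumption):

```python
import numpy as np

def softmax(x, axis=1):
    """Numerically stable softmax along the given axis."""
    z = x - x.max(axis=axis, keepdims=True)  # shift for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)
```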
- softmaxGrad
Operation softmaxGrad(Operation parentGrad, Operation op, string mod = __MODULE__, size_t line = __LINE__)
Creates an operation representing the gradient of the softmax function.
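Because the softmax Jacobian at output y is diag(y) - y y^T, the backward pass reduces to y * (g - sum(g * y)) along the class axis, without ever materialising the Jacobian. A NumPy sketch of that identity (axis and layout are assumptions):

```python
import numpy as np

def softmax_grad(parent_grad, y, axis=1):
    """Backprop through softmax given its output y:
    y * (g - sum(g * y)) along the class axis."""
    inner = (parent_grad * y).sum(axis=axis, keepdims=True)
    return y * (parent_grad - inner)
```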
Contains common neural network operations.
These operations are currently only implemented for the CUDA backend.