dopt ~master (2018-02-24T12:31:40Z)
dopt.nnet.layers.relu
Contains an implementation of the ReLU activation function.
Members
Functions
Layer relu(Layer input)
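The relu layer applies the rectified linear unit element-wise, f(x) = max(x, 0), to the activations of the input layer. The following is a minimal standalone D sketch of that element-wise operation; it uses only the standard library and does not call the dopt API, which on this page documents only the Layer relu(Layer input) signature shown above.

import std.algorithm : map, max;
import std.array : array;
import std.stdio : writeln;

void main()
{
    // ReLU clamps negative values to zero and passes positive values through.
    auto inputs = [-2.0, -0.5, 0.0, 1.5, 3.0];
    auto activations = inputs.map!(x => max(x, 0.0)).array;
    writeln(activations); // [0, 0, 0, 1.5, 3]
}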
Meta
Authors
Henry Gouk