Brief example

The quick example below runs a neural network with one hidden layer and
computes its squared error with respect to a target targ. The network is
parameterized by two weight matrices and two bias vectors.
Vector/matrix types are from the hmatrix package.

Let’s make a data type to store our parameters, with convenient accessors using
lens:
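A sketch of such a type might look like the following (the layer sizes here, 100 inputs, 20 hidden units, and 5 outputs, are hypothetical and chosen only for illustration; the Backprop instance assumes instances for L and R, e.g. from the hmatrix-backprop package):

```haskell
{-# LANGUAGE DataKinds       #-}
{-# LANGUAGE DeriveGeneric   #-}
{-# LANGUAGE TemplateHaskell #-}

import           Control.Lens.TH              (makeLenses)
import           GHC.Generics                 (Generic)
import           Numeric.Backprop             (Backprop)
import           Numeric.LinearAlgebra.Static (L, R)

-- Two weight matrices and two bias vectors; the concrete sizes
-- (100 inputs, 20 hidden units, 5 outputs) are hypothetical.
data Net = Net { _weight1 :: L 20 100
               , _bias1   :: R 20
               , _weight2 :: L  5  20
               , _bias2   :: R  5
               }
  deriving Generic

-- Derived generically; assumes Backprop instances for L and R
-- are in scope (provided by hmatrix-backprop).
instance Backprop Net

makeLenses ''Net
```

makeLenses generates the weight1, bias1, weight2, and bias2 lenses used to access the fields of a Net inside a backpropagated computation.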

The benchmarks also include a hybrid approach, which provides manual gradients
for individual layers but uses automatic differentiation to chain the layers
together.

We can see that simply running the network and loss functions (using evalBP)
incurs virtually zero overhead. This means that library authors could actually
export only backprop-lifted functions, and users would be able to use them
without losing any performance.
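As a minimal illustration of the two modes (using a hypothetical square function rather than the benchmarked network), evalBP simply evaluates a lifted function, while gradBP backpropagates through it:

```haskell
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE RankNTypes       #-}

import Numeric.Backprop

-- A hypothetical lifted function, for illustration only.
square :: Reifies s W => BVar s Double -> BVar s Double
square x = x * x

main :: IO ()
main = do
    print (evalBP square 4)   -- evaluate the function itself: 16.0
    print (gradBP square 4)   -- backpropagate to get the gradient: 8.0
```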

Note that the manual and hybrid modes almost overlap in the range of their
random variances.

Comparisons

backprop can be compared and contrasted to many other similar libraries with
some overlap:

The ad library (and variants like diffhask) support automatic
differentiation, but only for homogeneous/monomorphic situations. All
values in a computation must be of the same type — so, your computation
might be the manipulation of Doubles through a Double -> Double
function.

backprop allows you to mix matrices, vectors, doubles, integers, and even
key-value maps as a part of your computation, and they will all be
backpropagated properly with the help of the Backprop typeclass.
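For instance, a sketch of such a heterogeneous computation might look like the following (the lifted norm_2V and #> operations come from the hmatrix-backprop package, and the concrete sizes are hypothetical):

```haskell
{-# LANGUAGE DataKinds        #-}
{-# LANGUAGE FlexibleContexts #-}

import Numeric.Backprop
import Numeric.LinearAlgebra.Static.Backprop  -- lifted hmatrix operations

-- Mixes a matrix, a vector, and a Double in one computation;
-- gradients are backpropagated to all three inputs.
loss :: Reifies s W
     => BVar s (L 5 10)   -- a weight matrix
     -> BVar s (R 10)     -- an input vector
     -> BVar s Double     -- a scalar scaling factor
     -> BVar s Double
loss w x c = c * norm_2V (w #> x)
```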

The autograd library is a very close equivalent to backprop,
implemented in Python for Python applications. The difference between
backprop and autograd is mostly the difference between Haskell and
Python — static types with type inference, purity, etc.

backprop can also be compared to deep learning/neural network libraries
like tensorflow, caffe, and theano, which all support some form of
heterogeneous automatic differentiation. Haskell libraries doing similar
things include grenade.

These are all frameworks for working with neural networks or other
gradient-based optimizations: they include things like built-in optimizers,
utilities for managing training data, and built-in models to use out of
the box. backprop could be used as a part of such a framework, as I
described in my A Purely Functional Typed Approach to Trainable
Models blog series; however, the backprop library itself does not provide
any built-in models, optimizers, or automated data-processing pipelines.

Version 0.2.5.0

Since type-combinators has been unmaintained for over two years, and is
no longer compatible with modern GHC, the library internals were rewritten
to be built on the type-level combinators in the vinyl library instead.
The main external API change is that Every is replaced with
AllConstrained, and Known Length is replaced with RecApplicative.

To most users, this should make no difference API-wise. The only users
affected should be those using the “-N” family of functions (backpropN),
who have to pass in heterogeneous lists. Heterogeneous lists now must be
passed in using vinyl syntax and operators instead of the previous
type-combinators interface.

bpOp added, to allow for non-rank-N storage of backpropagatable
functions in containers without impredicative types.

Version 0.2.4.0

NOTE Major breaking changes to Explicit modules, and some re-shuffling of
typeclass constraints on various non-explicit functions that should only affect
polymorphic usage.

Huge improvements in performance! Around 20-40% reduction in
runtimes/overheads, with higher savings for large-matrix situations or
situations with an expensive addition operation.

However, this restructuring required major reshuffling of constraints on
Backprop/Num for most functions. These are potentially breaking
changes for polymorphic code, but monomorphic code should remain
unchanged. Code using the Explicit interfaces, unfortunately, is most
likely broken; fixes typically just involve adding or dropping zeroFuncs
arguments to the appropriate functions.

Added warnings to Explicit modules that the API is “semi-stable”.

overVar and %~~ added, for modifying fields; essentially a wrapper over
viewVar and setVar.

Argument order in the backpropWith family of functions changed again;
breaking change for those using any backpropWith function. However,
the new order is much more usable.

Changes to the argument order in the backprop family of functions in the
Explicit interfaces now reverted back to previous order, from v0.2.0 and
before. Should be an “un-breaking” change, but will break code written in
v0.2.3 style.

Benchmarks now include HKD access and a “hybrid” approach. Documentation
updated to reflect results.

Documentation updated to include a new “unital” law for one, namely one = gradBP id.

Fixity declarations for ^^?, ^^?!, and <$>.

Added fmap . const and <$ to Prelude modules.

Backprop instances for Expr from simple-reflect.

Added zeroVecNum and oneVecNum to Numeric.Backprop.Class, which is
potentially more efficient than zeroVec and oneVec if the items are
instances of Num and the vectors are larger. Also added NumVec newtype
wrapper giving Backprop instances to vectors using zeroVecNum and
oneVecNum instead of zeroVec and oneVec.

Version 0.2.3.0

Argument order in backpropWith family of functions switched around to
allow for final gradient to be given after-the-fact. Breaking change
for anyone using any backpropWith function.

As a consequence of the previous change, backprop family of functions in
Explicit interfaces also all changed argument order. Breaking change
only for those using the Explicit interfaces.

Explicit collectVar no longer needs a ZeroFunc for the container, and
so all versions of collectVar and functions that use it (fmap,
liftA2, liftA3, traverse, mapAccumL, mapAccumR) no longer require
Backprop or Num instances for the final returned container type. This
enables a lot more flexibility in container types. Breaking change
only for those using the Explicit interfaces.

BV pattern synonym added to Numeric.Backprop, abstracting over
application of splitBV and joinBV.

Version 0.2.0.0

Added Backprop class in Numeric.Backprop.Class, which is a typeclass
specifically for “backpropagatable” values. This will replace Num.

API of Numeric.Backprop completely re-written to require values be
instances of Backprop instead of Num. This closes some outstanding
issues with the reliance on Num, and allows backpropagation to work with
non-Num instances like variable-length vectors, matrices, lists, tuples,
etc. (including types from accelerate).

Numeric.Backprop.Num and Prelude.Backprop.Num modules added, providing
the old interface that uses Num instances instead of Backprop
instances, for those who wish to avoid writing orphan instances when
working with external types.

Numeric.Backprop.Explicit and Prelude.Backprop.Explicit modules added,
providing an interface that allows users to manually specify how zeroing,
addition, and one-ing works on a per-value basis. Useful for those who
wish to avoid writing orphan instances of Backprop for types with no
Num instances, or if you are mixing and matching styles.

backpropWith variants added, allowing you to specify a “final gradient”,
instead of assuming it to be 1.

Added auto, a shorter alias for constVar inspired by the ad library.

Numeric.Backprop.Tuple module removed. I couldn’t find a significant
reason to keep it now that Num is no longer required for backpropagation.

Version 0.1.5.2

Added Random instances for all tuple types. As with Binary, this
incurs random and time dependencies only because of the tuple types.
Again, because these packages are a part of GHC’s boot libraries, this
is hopefully not too bad.

Added Binary instances for all tuple types. Note that this incurs a
binary dependency only because of the tuple types; however, this will
hopefully not be too much of an issue because binary is a GHC boot
library anyway.

Version 0.1.4.0

isoVar, isoVar2, isoVar3, and isoVarN: convenient aliases for
applying isomorphisms to BVars. Helpful for use with constructors and
deconstructors.

opIso2 and opIso3 added to Numeric.Backprop.Op, for convenience.

T0 (Unit with numeric instances) added to Numeric.Backprop.Tuple.

Internal

Completely decoupled the internal implementation from Num, which showed
some performance benefits. Mostly just to make the code slightly cleaner,
and to prepare for some day potentially decoupling the external API from
Num as well.

Version 0.1.1.0

Added canonical strict tuple types with Num instances, in the module
Numeric.Backprop.Tuple. This is meant to be a band-aid for the problem
of orphan instances and potential mismatched tuple types.

Fixed a bug in collectVar that occurred when container sizes changed.

Internal

Internal tweaks to the underlying automatic differentiation types that
decouple backpropagation from Num, internally. Num is now just used
externally as a part of the API, which might someday be made optional.

Version 0.0.3.0

Removed samples as registered executables in the cabal file, to reduce
dependencies to a bare minimum. For convenience, the build script now also
compiles the samples into the local directory if stack is installed.

Added experimental (unsafe) combinators for working with GADTs with
existential types, withGADT, to Numeric.Backprop module.

Fixed broken links in changelog.

Version 0.0.2.0

Removed all traces of Summer/Unity from the library, eliminating a
whole swath of “explicit-Summer”/“explicit-Unity” versions of functions.
As a consequence, the library now only works with Num instances. The
API, however, is now much simpler.