autograd

This generates a new `autograd.nn.AutoModule.[moduleName]` class that takes a suitable forward function, which is executed in `:updateOutput`; the `updateGradInput` and `accGradParameters` differentiation is then handled automatically.

```python
>>> import autograd.numpy as np  # Thinly-wrapped numpy
>>> from autograd import grad    # The only autograd function you may ever need
```