The GPML toolbox implements approximate inference algorithms for Gaussian processes, such as Expectation Propagation, the Laplace approximation and Variational Bayes, for a wide class of likelihood functions for both regression and classification. It comes with a large algebra of covariance and mean functions that allows flexible modelling. The code is fully compatible with Octave 3.2.x.
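To illustrate the kind of approximate inference the toolbox performs, here is a minimal sketch of the Laplace approximation for binary GP classification: Newton iterations that find the posterior mode under a logistic likelihood. This is not GPML's code or API; the function names and the squared-exponential kernel below are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, y, n_iter=20):
    # Newton iterations for the posterior mode of a GP binary classifier
    # with a logistic likelihood; y takes values in {-1, +1}.
    # Update: f <- K (I + W K)^{-1} (W f + g), where g is the gradient and
    # W the (diagonal) negative Hessian of the log likelihood at f.
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        g = (y + 1) / 2 - pi        # gradient of log p(y|f)
        W = pi * (1 - pi)           # negative Hessian (diagonal entries)
        b = W * f + g
        f = K @ np.linalg.solve(np.eye(n) + W[:, None] * K, b)
    return f

# Toy usage on six 1-D inputs with a squared-exponential kernel (hypothetical)
x = np.linspace(-1.0, 1.0, 6)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.5**2) + 1e-6 * np.eye(6)
y = np.array([-1, -1, -1, 1, 1, 1])
f_hat = laplace_mode(K, y)
```

At the mode, the stationarity condition `f_hat == K @ g(f_hat)` holds, which gives a quick convergence check.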

Changes from the previous version:

We now support inference on large datasets using the FITC approximation by Ed Snelson. As a consequence, the covariance function interface had to be slightly modified.

A major code restructuring effort took place in the current release, unifying certain inference functions and allowing more flexibility in covariance function composition. We also redesigned the whole derivative computation pipeline, substantially improving the overall runtime. Finally, we now include grid-based covariance approximations natively.
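The idea behind composing covariance functions can be sketched as follows: sums and products of valid covariance functions are again valid covariance functions, so complex kernels can be assembled from simple parts. This sketch is illustrative only and does not mirror GPML's actual composition interface:

```python
import numpy as np

def se_kernel(ell, sf):
    # Factory for a squared-exponential covariance function
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return sf**2 * np.exp(-0.5 * d2 / ell**2)
    return k

def cov_sum(k1, k2):
    # The sum of two covariance functions is a valid covariance function
    return lambda a, b: k1(a, b) + k2(a, b)

def cov_prod(k1, k2):
    # The product of two covariance functions is a valid covariance function
    return lambda a, b: k1(a, b) * k2(a, b)

x = np.linspace(0.0, 1.0, 20)
k = cov_sum(se_kernel(0.2, 1.0),
            cov_prod(se_kernel(1.0, 0.5), se_kernel(0.1, 2.0)))
K = k(x, x) + 1e-9 * np.eye(len(x))   # jitter for numerical stability
```

Any such composite must yield a symmetric positive semidefinite Gram matrix, which is easy to verify numerically.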

We now support inference on large datasets with non-Gaussian likelihoods using the FITC approximation, for both EP and Laplace's approximation.
New likelihood functions: a mixture likelihood, a Poisson likelihood, and label noise.
We added two MCMC samplers.
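As an example of MCMC for GP models, here is a minimal sketch of elliptical slice sampling, a common sampler for latent variables under a Gaussian prior. The source does not say which two samplers were added, so this is illustrative only, not GPML's implementation:

```python
import numpy as np

def elliptical_slice(f, L, log_lik, rng):
    # One elliptical slice sampling update for a latent vector f with
    # prior N(0, L L^T); log_lik maps f to the log likelihood.
    nu = L @ rng.standard_normal(len(f))          # auxiliary prior draw
    log_y = log_lik(f) + np.log(rng.uniform())    # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new                          # accepted proposal
        # shrink the bracket towards the current state and retry
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Toy usage: sample the latent function of a GP regression model
rng = np.random.default_rng(0)
n = 10
x = np.linspace(0.0, 1.0, n)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3**2) + 1e-8 * np.eye(n)
L = np.linalg.cholesky(K)
y = np.sin(2.0 * np.pi * x)
log_lik = lambda f: -0.5 * np.sum((y - f) ** 2) / 0.1**2
f = np.zeros(n)
for _ in range(200):
    f = elliptical_slice(f, L, log_lik, rng)
```

The shrinking bracket guarantees termination, since the angle theta = 0 recovers the current state, which always satisfies the slice condition.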