Gaussian Processes for Stationary and Nonstationary
Smoothing: An Overview and Some Applications

Modeling regression functions nonparametrically in a Bayesian manner requires
a prior over function spaces. One popular choice of prior has been the
Gaussian process. I will first provide an introduction to Gaussian process
regression models, including an overview of how Gaussian processes have been
used in the spatial statistics literature under the name of kriging. The
success of the Gaussian process approach hinges on the covariance function,
which describes how values of the response function at different points in
predictor space covary, thereby imposing smoothness constraints on the
regression function. Most previous work in both the machine learning
and spatial statistics communities has focused on stationary covariance
functions. I extend earlier work using nonstationary covariance functions
that allow the regression function to vary in smoothness within the space of
the predictor variables. This allows one to more accurately fit functions
that are smooth in some areas and wiggly in others. In contrast, the use of
stationary covariance functions results in oversmoothing in some parts of the
space and undersmoothing in other parts.
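To make the stationary baseline concrete, here is a minimal sketch of Gaussian process regression with a squared-exponential covariance, whose value depends only on the distance between inputs. The kernel hyperparameters, noise variance, and toy data are illustrative choices, not values from the work described above.

```python
import numpy as np

def sq_exp_cov(x1, x2, variance=1.0, length_scale=0.3):
    """Stationary squared-exponential covariance: depends only on |x1 - x2|,
    so the implied smoothness is the same everywhere in predictor space."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var=0.01):
    """Posterior mean of a zero-mean GP: K_* (K + sigma^2 I)^{-1} y."""
    K = sq_exp_cov(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_star = sq_exp_cov(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Toy example: noisy observations of a smooth function.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(20)
x_new = np.array([0.25, 0.5])
print(gp_posterior_mean(x, y, x_new))
```

A nonstationary version would let `length_scale` vary over predictor space, so the fit can be wiggly where the data demand it and smooth elsewhere.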

The model can be fit by several methods, all based on Markov chain Monte
Carlo algorithms. Preliminary results suggest that the nonstationary Gaussian
process model is competitive with successful spline-based methods when using a
small number of predictor variables. However, a fully Bayesian treatment is
difficult with more than a few hundred data points. Some work in the
literature suggests ways to speed up the computations, but the fundamental
limitation of the method remains the O(n^3) computations involved in working
with the prior covariance matrix.
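The O(n^3) bottleneck arises because each likelihood evaluation requires factorizing the n-by-n prior covariance matrix, typically via a Cholesky decomposition. The sketch below shows one standard way this appears in the GP log marginal likelihood; the toy covariance and hyperparameters are illustrative assumptions.

```python
import numpy as np

def gp_log_marginal_likelihood(K, y, noise_var=0.01):
    """log p(y) for a zero-mean GP with covariance K plus observation noise.
    The Cholesky factorization of the n x n matrix dominates at O(n^3)."""
    n = len(y)
    L = np.linalg.cholesky(K + noise_var * np.eye(n))    # O(n^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # two triangular solves
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()      # 0.5 * log det via Cholesky
            - 0.5 * n * np.log(2 * np.pi))

# Toy covariance from a stationary squared-exponential kernel.
x = np.linspace(0.0, 1.0, 50)
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.2) ** 2)
y = np.sin(2 * np.pi * x)
print(gp_log_marginal_likelihood(K, y))
```

In an MCMC fit, this quantity is recomputed at every proposed set of covariance parameters, which is why the cubic cost becomes prohibitive beyond a few hundred data points.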