We construct a new framework for accelerating MCMC algorithms that sample from posterior distributions in the context of computationally intensive models. We proceed by constructing local surrogates of the forward model within the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. Our work builds upon previous work in surrogate-based inference by exploiting useful convergence characteristics of local surrogates. We prove the ergodicity of our approximate Markov chain and show that asymptotically it samples from the exact posterior density of interest. We describe variations of the algorithm that construct either local polynomial approximations or Gaussian process regressors, thus spanning two important classes of surrogate models. Numerical experiments demonstrate significant reductions in the number of forward model evaluations on representative ODE- and PDE-based inference problems, using both real and synthetic data.
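To make the idea concrete, here is a minimal sketch of a Metropolis-Hastings sampler that evaluates a local polynomial surrogate of the forward model and refines it (i.e., runs the true model) wherever too few design points lie nearby. This is an illustrative toy, not the authors' algorithm: the forward model, the refinement rule based on a fixed `radius`, and all parameter values are assumptions made for the example.

```python
import numpy as np

# Stand-in for an expensive forward model (e.g., an ODE/PDE solve).
def forward(theta):
    return np.sin(3.0 * theta) + 0.5 * theta

def log_post(g, theta, y=0.4, sigma=0.2):
    # Gaussian likelihood around data y, flat prior on [-2, 2].
    if abs(theta) > 2.0:
        return -np.inf
    return -0.5 * ((y - g) / sigma) ** 2

def surrogate_mh(n_steps=2000, radius=0.15, seed=0):
    rng = np.random.default_rng(seed)
    # Design set of true model evaluations, grown adaptively.
    X = list(np.linspace(-2.0, 2.0, 5))
    F = [forward(x) for x in X]
    n_true = [len(X)]  # count of true model runs

    def predict(theta):
        Xa = np.array(X)
        near = np.abs(Xa - theta) < radius
        if near.sum() < 3:
            # Refinement: too few local design points, so run the
            # true model at theta and add it to the design set.
            X.append(theta)
            F.append(forward(theta))
            n_true[0] += 1
            return F[-1]
        # Otherwise fit a local quadratic through nearby evaluations.
        c = np.polyfit(Xa[near], np.array(F)[near], 2)
        return np.polyval(c, theta)

    theta = 0.0
    lp = log_post(predict(theta), theta)
    samples = []
    for _ in range(n_steps):
        prop = theta + 0.3 * rng.standard_normal()
        lp_prop = log_post(predict(prop), prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # MH accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples), n_true[0]

samples, n_true = surrogate_mh()
print(n_true, len(samples))
```

The point of the sketch is the bookkeeping: the chain takes thousands of steps, but the true model is run only where the local design is sparse, so `n_true` ends up far smaller than the number of MCMC steps. A Gaussian process regressor could replace the local quadratic fit with the same control flow.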

This is joint work with Youssef Marzouk, Natesh Pillai, and Aaron Smith.

This entry was posted on November 28, 2013 at 05:50 and is filed under MCMSki IV abstract.