Abstract: Inverse problems convert indirect measurements into useful characterizations of the parameters of a physical system. Parameters are typically related to indirect measurements by a system of partial differential equations (PDEs), which are complicated and expensive to evaluate. Available indirect data are often limited, noisy, and subject to natural variation, while the unknown parameters of interest are often high dimensional, or infinite dimensional in principle. Solution of the inverse problem, along with prediction and uncertainty assessment, can be cast in a Bayesian setting and thus naturally tackled with Markov chain Monte Carlo (MCMC) and other posterior sampling methods. However, designing scalable and efficient sampling methods for high dimensional inverse problems that involve expensive PDE evaluations poses a significant challenge. This mini-symposium presents recent advances in sampling approaches for large scale inverse problems.

MS-Th-D-36-1  13:30--14:00
High dimensional non-Gaussian Bayesian inference with transport maps
Spantini, Alessio (MIT); Marzouk, Youssef (Massachusetts Inst. of Tech.)
Abstract: Characterizing high dimensional posterior distributions in the context of nonlinear and non-Gaussian Bayesian inverse problems is a well-known challenging task. A recent approach to this problem seeks a deterministic transport map from a reference distribution to the posterior; posterior samples can then be obtained simply by pushing forward reference samples through the map. In this talk, we address the computation of the transport map in high dimensions. In particular, we propose a scalable adaptive algorithm.
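The pushforward idea can be sketched with a toy fixed lower-triangular map; this is only an assumed illustrative example, not the adaptive high-dimensional construction discussed in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Lower-triangular (Knothe-Rosenblatt-style) map T: reference -> target.
# Component k of T depends only on reference coordinates 1..k, which is
# the structure triangular transport maps exploit for scalability.
def T(z):
    x1 = z[:, 0]                        # first component: identity
    x2 = 0.5 * z[:, 0] ** 2 + z[:, 1]   # second: monotone in z2 given z1
    return np.column_stack([x1, x2])

# "Posterior" samples are obtained by pushing reference samples through T.
z = rng.standard_normal((100_000, 2))   # reference: standard Gaussian
x = T(z)                                # target samples (banana-shaped)
```

Sampling is then embarrassingly cheap once the map is available: no further density evaluations are needed.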

MS-Th-D-36-2  14:00--14:30
Advances in Generalised Metropolis-Hastings Algorithms
Calderhead, Ben (Imperial College London)
Abstract: A recent generalization of the Metropolis-Hastings algorithm allows a single chain to be parallelized using existing MCMC methods (Calderhead, PNAS, 2014). The construction involves proposing multiple points in parallel, then defining and sampling from a finite-state Markov chain on the proposed points such that the overall procedure has the correct target density as its stationary distribution. In this talk I'll discuss this algorithm and some of the most recent advances employing this approach.
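A minimal sketch of the multiple-proposal construction, restricted (as an assumption) to the special case of an independence proposal, for which the stationary weights of the finite-state chain reduce to importance ratios pi(x_i)/q(x_i); the target and proposal densities below are toy examples, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

log_pi = lambda x: -0.5 * x ** 2            # toy unnormalized target N(0,1)
log_q = lambda x: -0.5 * (x / 2.0) ** 2     # independence proposal N(0,4)

def generalised_mh_step(x_curr, n_prop):
    """Propose n_prop points in parallel, then sample the next states
    from the finite-state chain on {current point} U {proposals}. With
    an independence proposal, the stationary weights of that finite
    chain are the importance ratios pi(x_i)/q(x_i)."""
    pts = np.concatenate([[x_curr], rng.normal(0.0, 2.0, size=n_prop)])
    logw = log_pi(pts) - log_q(pts)          # stationary log-weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # The n_prop density evaluations above are where parallelism pays off.
    return rng.choice(pts, size=n_prop, p=w)

x, chain = 0.0, []
for _ in range(2000):
    states = generalised_mh_step(x, n_prop=10)
    chain.extend(states)                     # keep all sampled states
    x = states[-1]                           # carry one state forward
chain = np.asarray(chain)
```

The overall chain leaves the target invariant because the current point is included among the weighted states.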

MS-Th-D-36-3  14:30--15:00
Multilevel Sequential Monte Carlo Samplers
Law, Kody (ORNL); Tempone, Raul (King Abdullah Univ. of Sci. & Tech.)
Abstract: The approximation of the posterior distribution associated with a Bayesian inverse problem is decomposed into a hierarchy consisting of a telescoping sum of increments with decreasing variance and increasing cost, and then probed with a sequential Monte Carlo sampler. The number of samples per level is optimized with respect to cost for a fixed root-mean-square error, which then optimally scales as the inverse square root of the cost.
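The per-level sample allocation can be illustrated with the standard multilevel Monte Carlo formula; the variance and cost sequences below are assumed illustrative values, not taken from the talk:

```python
import math

def mlmc_allocation(variances, costs, eps):
    """Optimal sample sizes per level for the telescoping sum: minimise
    total cost sum(N_l * C_l) subject to the variance constraint
    sum(V_l / N_l) <= eps**2. The Lagrange-multiplier solution gives
    N_l proportional to sqrt(V_l / C_l)."""
    s = sum(math.sqrt(v * c) for v, c in zip(variances, costs))
    return [math.ceil(eps ** -2 * math.sqrt(v / c) * s)
            for v, c in zip(variances, costs)]

# Assumed illustrative decay: increment variance falls, cost grows.
V = [1.0, 0.25, 0.0625, 0.015625]   # increment variances V_l
C = [1.0, 2.0, 4.0, 8.0]            # per-sample costs C_l
N = mlmc_allocation(V, C, eps=0.1)  # most samples land on cheap levels
```

Because variance decays faster than cost grows here, almost all samples are taken on the coarse, cheap levels, which is what drives the favorable cost scaling.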

MS-Th-D-36-4  15:00--15:30
Operator-weighted MCMC on function spaces
Cui, Tiangang (MIT); Law, Kody (ORNL); Marzouk, Youssef (Massachusetts Inst. of Tech.)
Abstract: Many inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. We introduce a family of operator-weighted MCMC samplers that can adapt to the intrinsically low-rank and locally complex structure of the posterior distribution while remaining well defined on function space. Posterior sampling in a nonlinear inverse problem and a conditioned diffusion process are used to demonstrate the efficiency of these dimension-independent operator-weighted samplers.
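The operator-weighted samplers themselves are not reproduced here; as a hedged illustration of what "well defined on function space" means, the sketch below implements the related preconditioned Crank-Nicolson (pCN) proposal on a discretized Gaussian prior with an assumed toy likelihood:

```python
import numpy as np

rng = np.random.default_rng(2)

# Discretized prior covariance (assumed diagonal for simplicity); in
# function space this would be a trace-class covariance operator.
n = 50
prior_std = 1.0 / (1.0 + np.arange(n))      # decaying spectrum

def log_likelihood(u):
    # Assumed toy data-misfit: data inform only the first 5 modes.
    return -0.5 * np.sum((u[:5] - 1.0) ** 2)

def pcn_step(u, beta=0.3):
    """One preconditioned Crank-Nicolson step. The acceptance ratio
    involves only the likelihood, so the scheme remains well defined
    (and its acceptance rate stable) as the discretization is refined."""
    xi = prior_std * rng.standard_normal(n)          # prior draw
    v = np.sqrt(1.0 - beta ** 2) * u + beta * xi     # pCN proposal
    if np.log(rng.uniform()) < log_likelihood(v) - log_likelihood(u):
        return v
    return u

u = np.zeros(n)
samples = []
for _ in range(5000):
    u = pcn_step(u)
    samples.append(u.copy())
samples = np.asarray(samples)
```

The operator-weighted family generalizes this by replacing the scalar step size beta with operators adapted to the likelihood-informed, low-rank directions of the posterior.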