Abstract: Practical implementations of Bayesian inference are often limited to approximation methods that only slowly explore the posterior distribution. By taking advantage of the curvature of the posterior, however, Hamiltonian Monte Carlo (HMC) efficiently explores even the most highly contorted distributions. In this talk I will review the foundations of and recent developments within HMC, concluding with a discussion of Stan, a powerful inference engine that utilizes HMC, automatic differentiation, and adaptive methods to minimize user input.
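The core mechanics behind HMC can be sketched in a few lines. This is a minimal, illustrative sampler for a one-dimensional standard normal target; the step size, number of leapfrog steps, and target distribution are my own toy choices, not Stan's actual defaults (Stan adapts these automatically and computes gradients via automatic differentiation).

```python
# Toy Hamiltonian Monte Carlo on a 1-D standard normal target.
# Illustrative only: real implementations (e.g. Stan) adapt the step
# size and trajectory length and use automatic differentiation.
import math
import random

def logp(x):
    """Log density of N(0, 1), up to an additive constant."""
    return -0.5 * x * x

def grad_logp(x):
    """Gradient of the log density (Stan derives this automatically)."""
    return -x

def hmc_step(x, eps=0.3, n_leapfrog=10):
    """One HMC transition: sample momentum, simulate Hamiltonian
    dynamics with the leapfrog integrator, then accept/reject."""
    p = random.gauss(0.0, 1.0)              # fresh momentum
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad_logp(x_new)   # initial half momentum step
    for i in range(n_leapfrog):
        x_new += eps * p_new                # full position step
        if i < n_leapfrog - 1:
            p_new += eps * grad_logp(x_new) # full momentum step
    p_new += 0.5 * eps * grad_logp(x_new)   # final half momentum step
    # Metropolis correction using the Hamiltonian H = -logp + p^2/2
    h_old = -logp(x) + 0.5 * p * p
    h_new = -logp(x_new) + 0.5 * p_new * p_new
    if math.log(random.random()) < h_old - h_new:
        return x_new
    return x

random.seed(1)
samples, x = [], 0.0
for _ in range(6000):
    x = hmc_step(x)
    samples.append(x)

burn = samples[1000:]                       # discard warmup draws
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
```

Because the leapfrog integrator nearly conserves the Hamiltonian, almost every proposal is accepted even though each one moves far across the distribution; that is the efficiency the abstract refers to.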

I’m very fortunate that just when my work required the use of Stan (which I was only theoretically familiar with), two talks have come into existence to get me up to speed: first the presentation at the LA R User Group last week, and now this one.

UCLA CHS – Center for Health Sciences. The building houses the School of Public Health, Medical School, Dental School, and used to house the UCLA hospital. The public health school has a Department of Community Health Sciences, another CHS, so the confusion is understandable.