December
SVBMC: Fast post-processing Bayesian inference with noisy evaluations of the likelihood
Grégoire Clarté University of Helsinki, University of Edinburgh
In many cases, the exact likelihood is unavailable and can only be accessed through a noisy and expensive process – for example, in plasma physics. Furthermore, Bayesian inference often comes as a second step, for example after running an optimization algorithm to find a MAP estimate. To tackle both these issues, we introduce Sparse Variational Bayesian Monte Carlo (SVBMC), a method for fast "post-process" Bayesian inference for models with black-box and noisy likelihoods. SVBMC reuses all existing target density evaluations – for example, from previous optimizations or partial Markov chain Monte Carlo runs – to build a sparse Gaussian process (GP) surrogate model of the log posterior density. Uncertain regions of the surrogate are then refined via active learning as needed. Our work builds on the Variational Bayesian Monte Carlo (VBMC) framework for sample-efficient inference, with several novel contributions. First, we make VBMC scalable to a large number of pre-existing evaluations via sparse GP regression, deriving novel Bayesian quadrature formulae and acquisition functions for active learning with sparse GPs. Second, we introduce noise shaping, a general technique to induce the sparse GP approximation to focus on high posterior density regions. Third, we prove theoretical results in support of the SVBMC refinement procedure. We validate our method on a variety of challenging synthetic scenarios and real-world applications. We find that SVBMC consistently builds good posterior approximations by post-processing existing model evaluations from different sources, often requiring only a small number of additional density evaluations.
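To illustrate the core idea of reusing pre-existing evaluations, here is a minimal sketch (not the authors' SVBMC implementation): it fits a GP surrogate of the log posterior to evaluations that could have come from an earlier optimization or MCMC run, and uses the surrogate's predictive uncertainty to flag regions that active learning would refine. The toy 2-D target, the names log_target and thetas, and the use of scikit-learn are illustrative assumptions.

```python
# Sketch only: GP surrogate of a noisy log posterior built from reused evaluations.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def log_target(theta):
    # Toy log posterior: correlated 2-D Gaussian, observed with noise
    # to mimic a noisy black-box likelihood.
    x, y = theta
    return -0.5 * (x**2 + (y - x)**2) + 0.1 * rng.normal()

# Pretend these points come from a previous optimization or partial MCMC run.
thetas = rng.normal(size=(200, 2)) * 2.0
log_p = np.array([log_target(t) for t in thetas])

# GP surrogate of the log posterior; WhiteKernel absorbs evaluation noise.
gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(thetas, log_p)

# The surrogate is now cheap to query; its predictive variance indicates
# uncertain regions where new target evaluations would be requested.
grid = rng.uniform(-4, 4, size=(5, 2))
mean, std = gp.predict(grid, return_std=True)
print(np.c_[grid, mean, std])
```

SVBMC additionally uses sparse GP regression and noise shaping so that the surrogate scales to many evaluations and concentrates accuracy in high posterior density regions; the dense GP above is only for illustration.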
Variance reduction using control variates and importance sampling for applications in computational statistical physics
Urbain Vaes INRIA, CERMICS
The scaling of the mobility coefficient associated with two-dimensional Langevin dynamics in a periodic potential as the friction vanishes is not well understood. Theoretical results are lacking, and numerical calculation of the mobility in the underdamped regime is challenging. In the first part of this talk, I will present a new variance reduction approach based on control variates for efficiently estimating the mobility of Langevin-type dynamics, together with numerical experiments illustrating the performance of the approach.
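As a reminder of the mechanism behind control variates (this is a generic illustration, not the mobility estimator from the talk): the variance of a Monte Carlo estimate of E[f(X)] is reduced by subtracting a correlated quantity g(X) with known expectation, scaled by a near-optimal coefficient Cov(f, g)/Var(g). The observable f and control variate g below are assumptions chosen for simplicity.

```python
# Sketch only: classic control-variate variance reduction for E[exp(X)], X ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)

f = np.exp(x)   # quantity of interest; true mean is exp(1/2)
g = x           # control variate with known mean E[g] = 0

c = np.cov(f, g)[0, 1] / np.var(g)   # near-optimal coefficient
plain = f.mean()
cv = (f - c * g).mean()

print(f"plain MC:        {plain:.4f}  (variance {f.var() / x.size:.2e})")
print(f"control variate: {cv:.4f}  (variance {(f - c * g).var() / x.size:.2e})")
```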
In the second part of this talk, we study an importance sampling approach for calculating averages with respect to multimodal probability distributions. Traditional Markov chain Monte Carlo methods to this end, which are based on time averages along a realization of a Markov process ergodic with respect to the target probability distribution, are usually plagued by a large variance due to the metastability of the process. The estimator we study is based on an ergodic average along a realization of an overdamped Langevin process for a modified potential. We obtain an explicit expression for the optimal biasing potential in dimension 1 and propose a general numerical approach for approximating the optimal potential in the multi-dimensional setting.
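The following sketch illustrates the reweighting principle behind such an estimator, under simplifying assumptions (a 1-D double-well potential V and a hand-picked Gaussian-bump biasing potential U, not the optimal bias derived in the talk): run an overdamped Langevin process for the modified potential V + U, which crosses the barrier more easily, then reweight the time average by exp(beta * U) to recover averages with respect to exp(-beta * V).

```python
# Sketch only: importance sampling via a biased overdamped Langevin process.
import numpy as np

rng = np.random.default_rng(2)
beta, dt, n_steps = 3.0, 1e-3, 500_000

V  = lambda x: (x**2 - 1.0)**2              # double well, modes near x = +-1
dV = lambda x: 4.0 * x * (x**2 - 1.0)
U  = lambda x: -0.9 * np.exp(-2.0 * x**2)   # biasing potential lowering the barrier at x = 0
dU = lambda x:  3.6 * x * np.exp(-2.0 * x**2)

f = lambda x: x**2                          # observable to average

x, xs = 1.0, np.empty(n_steps)
for i in range(n_steps):
    # Euler-Maruyama step for the modified potential V + U.
    x += -(dV(x) + dU(x)) * dt + np.sqrt(2.0 * dt / beta) * rng.normal()
    xs[i] = x

# Importance weights: target exp(-beta*V) over sampling density exp(-beta*(V+U)).
w = np.exp(beta * U(xs))
print("biased average:    ", f(xs).mean())
print("reweighted average:", (w * f(xs)).sum() / w.sum())
```

The practical question addressed in the talk is how to choose the biasing potential so that the reweighted estimator has small variance; the explicit one-dimensional solution and the multi-dimensional numerical approximation replace the ad hoc bump used above.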