May

Time and date

4 PM, Friday, May 24, 2024

Salle 06, PariSanté Campus


Importance sampling with free energies and autoencoders for multimodal probability distributions

Gabriel Stoltz (CERMICS, Ecole des Ponts and Matherials project-team, Inria Paris)

Sampling high-dimensional probability measures is often made difficult by the multimodality of the target distribution: the MCMC scheme under consideration needs to pass through low-probability regions to switch from one mode to another, which is a rare event. An approach to making these transitions less rare is to identify a few selected (nonlinear) degrees of freedom of the system which are at the origin of the slow mixing, compute the associated free energy, and perform importance sampling based on this function. This was considered in [1], based on manually selected variables (mostly linear functions of the coordinates). Various tools have recently been developed in molecular simulation to automatically find the most relevant nonlinear degrees of freedom hindering sampling, based on machine learning techniques such as autoencoders [2]. I will present a methodology to leverage these models for better sampling, and will also provide a mathematical analysis of the approach, relating it to principal manifolds and providing an interpretation based on conditional expectations [3].

[1] N. Chopin, T. Lelièvre and G. Stoltz, Free energy methods for efficient exploration of mixture posterior densities, Stat. Comput. 22(4), 897-916 (2012) 
[2] Z. Belkacemi, P. Gkeka, T. Lelièvre and G. Stoltz, Chasing collective variables using autoencoders and biased trajectories, J. Chem. Theory Comput. 18(1), 59-78 (2022)
[3] T. Lelièvre, T. Pigeon, G. Stoltz and W. Zhang, Analyzing multimodal probability measures with autoencoders, J. Phys. Chem. B 128(11), 2607-2631 (2024)
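
As a toy illustration of the free-energy biasing idea sketched in the abstract (not the method of [1] or [3], just a minimal sketch under simplifying assumptions), the following Python snippet samples a two-dimensional double-well density with a random-walk Metropolis chain on the biased potential V - F(xi(x)). The collective variable xi stands in for the encoder of a trained autoencoder, and the free energy F along xi is known in closed form for this toy potential; importance weights exp(-beta*F(xi(x))) then recover averages under the original target.

    import numpy as np

    rng = np.random.default_rng(0)
    beta, h = 1.0, 4.0            # inverse temperature, barrier height (assumed values)

    def V(x):                     # toy double-well potential on R^2 (slow direction: x[0])
        return h * (x[0]**2 - 1.0)**2 + 0.5 * x[1]**2

    def xi(x):                    # collective variable; stand-in for a trained encoder
        return x[0]

    def F(z):                     # free energy along xi; analytic here because V separates
        return h * (z**2 - 1.0)**2

    def V_biased(x):              # biased potential V - F(xi(x)): flattens the slow direction
        return V(x) - F(xi(x))

    def rwm(Vfun, x0, n_steps, step=0.5):
        """Random-walk Metropolis chain targeting exp(-beta * Vfun)."""
        x, samples = np.asarray(x0, dtype=float), []
        for _ in range(n_steps):
            prop = x + step * rng.standard_normal(x.shape)
            if rng.random() < np.exp(min(0.0, -beta * (Vfun(prop) - Vfun(x)))):
                x = prop
            samples.append(x.copy())
        return np.array(samples)

    samples = rwm(V_biased, x0=[1.0, 0.0], n_steps=100_000)

    # Importance weights w proportional to pi(x)/pi_biased(x) = exp(-beta * F(xi(x)))
    logw = -beta * F(samples[:, 0])
    w = np.exp(logw - logw.max())
    w /= w.sum()

    print("weighted mean of x1 (should be ~0):", np.sum(w * samples[:, 0]))
    print("time spent in right well by biased chain:", np.mean(samples[:, 0] > 0))

In realistic settings, xi would instead come from an autoencoder trained as in [2], and the free energy along xi would have to be estimated (for instance by adaptive biasing or histogram methods) rather than written in closed form.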

Adaptive MCMC sampling using a Metropolized PDMP sampler combined with a No-U-Turn criterion

Augustin Chevallier (Université de Strasbourg)

Adaptivity in MCMC algorithms is hard to achieve. In Hamiltonian Monte Carlo, for instance, the path length can be adapted on the fly using the No-U-Turn criterion, but the numerical step size cannot be adapted in this way; it can only be tuned. We propose a new class of algorithms based on Metropolizing a numerical approximation of a PDMP sampler. Like HMC, these samplers require two parameters: a numerical step size and a path length. Unlike HMC, both parameters can be adapted. This paves the way for more robust sampling algorithms, especially for difficult target densities.
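
To make the two tunables concrete, here is a minimal Python sketch of one ingredient only: an Euler-type discretization of bouncy-particle-style PDMP dynamics on a Gaussian target, stopped by a No-U-Turn criterion. This is not the Metropolized sampler proposed in the talk (in particular there is no Metropolis correction or velocity refreshment here); the Gaussian target, the step size dt, and the function names are assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    # Correlated 2D Gaussian target: U(x) = 0.5 * x^T Sigma^{-1} x  (assumed toy target)
    Sigma_inv = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))

    def grad_U(x):
        return Sigma_inv @ x

    def bounce(v, g):
        """Bouncy-particle-style reflection of the velocity off the gradient."""
        return v - 2.0 * (v @ g) / (g @ g) * g

    def pdmp_path(x0, v0, dt, max_steps):
        """Euler-type discretization of straight-line PDMP dynamics with gradient
        bounces, stopped by a No-U-Turn criterion (illustration only)."""
        x, v = x0.copy(), v0.copy()
        for k in range(1, max_steps + 1):
            x = x + dt * v                        # free flight over one step
            g = grad_U(x)
            rate = max(0.0, v @ g)                # PDMP bounce rate lambda(x, v)
            if rng.random() < 1.0 - np.exp(-dt * rate):
                v = bounce(v, g)                  # event: reflect the velocity
            if (x - x0) @ v < 0.0:                # U-turn: path starts heading back
                return x, v, k
        return x, v, max_steps

    x0 = np.array([2.0, -2.0])
    v0 = rng.standard_normal(2)
    v0 /= np.linalg.norm(v0)
    x_end, v_end, n_used = pdmp_path(x0, v0, dt=0.05, max_steps=1000)
    print("path stopped after", n_used, "steps at", x_end)

The step size dt and the maximal number of steps play the roles of the two parameters mentioned in the abstract; in the samplers described in the talk, a Metropolis step additionally corrects for the discretization, which is what allows both parameters to be adapted.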