Dresden 2026 – scientific programme
DY: Fachverband Dynamik und Statistische Physik
DY 41: Poster: Nonlinear Dynamics, Granular Matter, and Machine Learning
DY 41.15: Poster
Wednesday, March 11, 2026, 15:00–18:00, P5
Neural-Network-Driven Sequential Quasi Monte Carlo Sampling for Bayesian Inference with Complex Posteriors — •Andreas Panagiotopoulos¹, Javed Mudassar², Jens-Uwe Repke², Georg Brösigke², and Sebastian Matera¹ — ¹Fritz-Haber-Institut der MPG, Berlin — ²Technische Universität Berlin
Bayesian inference has become increasingly popular in recent years because it overcomes some of the limitations of classical parameter fitting. The price is the need to sample the potentially high-dimensional parameter space, which can become computationally demanding for complex forward models. This challenge is exacerbated when uninformative priors meet accurate data in conjunction with highly sensitive and nonlinear models: the posteriors are then sharply localized and of complex shape, resulting in challenging sampling problems. To address this, we have developed a sequential importance sampling approach that uses neural-network normalizing flows to exploit the superior sampling properties of quasi-Monte Carlo (QMC) techniques. An initial sample from a simple distribution is used to learn the first layer of the normalizing flow, i.e. a first coarse approximation of the posterior distribution. QMC sampling from this flow then provides the data to learn the next layer, and so on. Since appropriate sampling of the posterior is intractable at early stages, a tempering strategy is employed to make the approach more robust. We demonstrate the approach on a realistic problem with complex posteriors from the field of chemical kinetics.
Keywords: Bayesian inference; Chemical kinetics; Neural posterior estimation; Normalizing flows; Likelihood intractable models
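The sequential scheme described in the abstract can be summarized in a few lines. The sketch below is illustrative only and is not the authors' implementation: it replaces the neural-network normalizing flow with a Gaussian proposal refitted at each tempering stage, uses a toy two-parameter likelihood as a stand-in for a chemical-kinetics forward model, and draws Sobol quasi-Monte Carlo points with scipy.stats.qmc; the tempering schedule and all numerical values are assumptions.

```python
# Minimal sketch of sequential quasi-Monte Carlo importance sampling with
# tempering. The neural normalizing flow of the abstract is replaced here by
# a Gaussian proposal refitted at each stage; target, data, and parameter
# values are purely illustrative.
import numpy as np
from scipy.stats import qmc, norm, multivariate_normal

rng = np.random.default_rng(0)
dim = 2
n = 2 ** 10                      # QMC sample size per stage (power of 2 for Sobol)
betas = [0.05, 0.2, 0.5, 1.0]    # assumed tempering schedule: beta * log-likelihood

def log_likelihood(theta):
    """Toy sharply localized, nonlinear likelihood (stand-in for an
    expensive chemical-kinetics forward model)."""
    y_obs = 1.3
    y_model = np.sin(3.0 * theta[:, 0]) * np.exp(theta[:, 1])
    return -0.5 * ((y_model - y_obs) / 0.05) ** 2

def log_prior(theta):
    """Broad (weakly informative) Gaussian prior."""
    return multivariate_normal(mean=np.zeros(dim), cov=4.0 * np.eye(dim)).logpdf(theta)

# Stage 0: start from a simple proposal (standard normal).
mu, cov = np.zeros(dim), np.eye(dim)

for beta in betas:
    # Quasi-Monte Carlo draw: scrambled Sobol points mapped through the
    # current Gaussian proposal via the inverse CDF and a Cholesky factor.
    u = qmc.Sobol(d=dim, scramble=True, seed=rng).random(n)
    z = norm.ppf(u)                              # standard-normal QMC points
    theta = mu + z @ np.linalg.cholesky(cov).T

    # Importance weights against the tempered posterior.
    log_prop = multivariate_normal(mean=mu, cov=cov).logpdf(theta)
    log_target = log_prior(theta) + beta * log_likelihood(theta)
    lw = log_target - log_prop
    w = np.exp(lw - lw.max())
    w /= w.sum()

    # "Learn the next layer": here we simply refit the Gaussian proposal to
    # the weighted samples (the abstract trains a normalizing-flow layer).
    mu = w @ theta
    cov = (theta - mu).T @ (w[:, None] * (theta - mu)) + 1e-6 * np.eye(dim)

ess = 1.0 / np.sum(w ** 2)
print(f"final proposal mean: {mu}, effective sample size: {ess:.1f}")
```

In the full method, the Gaussian refit is replaced by training an additional flow layer on the weighted QMC samples, so the proposal becomes progressively expressive enough to follow a sharply localized, non-Gaussian posterior.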
