Dresden 2026 – Scientific Programme
AKPIK: Working Group on Physics, Modern Information Technology and Artificial Intelligence
AKPIK 2: Machine Learning Prediction and Optimization Tasks
AKPIK 2.1: Talk
Tuesday, 10 March 2026, 09:30–09:45, BEY/0127
Bayesian Optimization for Mixed-Variable Problems in the Natural Sciences — •Yuhao Zhang1, Ti John2, Matthias Stosiek1, and Patrick Rinke1,3 — 1School of Natural Sciences, Physics Department, Technical University of Munich, Germany — 2Department of Computer Science, Aalto University, Finland — 3Munich Center for Machine Learning, Germany
Optimizing expensive black-box objectives over mixed search spaces is a common challenge across the natural sciences. Bayesian optimization (BO) offers sample-efficient strategies through probabilistic surrogate models and acquisition functions. However, its effectiveness diminishes in mixed or high-cardinality discrete spaces, where gradients are unavailable and optimizing the acquisition function becomes computationally demanding. In this work, we generalize the probabilistic reparameterization (PR) approach of Daulton et al. to handle non-equidistant discrete variables, enabling gradient-based optimization in fully mixed-variable settings with Gaussian process surrogates. With real-world scientific optimization tasks in mind, we conduct systematic benchmarks on synthetic and experimental objectives to obtain optimized kernel formulations and demonstrate the robustness of our generalized PR implementation. We additionally show that, when combined with a modified BO workflow, our approach can efficiently optimize highly discontinuous and discretized objective landscapes. This work establishes a practical BO framework for addressing fully mixed optimization problems encountered in the natural sciences.
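To illustrate the core idea of probabilistic reparameterization, the following is a minimal, self-contained sketch (an illustrative assumption, not the authors' implementation): a discrete variable with non-equidistant levels is replaced by a categorical distribution over those levels, so the *expected* acquisition value becomes a differentiable function of the continuous variable and the distribution's logits, amenable to gradient-based optimization. The toy acquisition function and all parameter names here are hypothetical.

```python
import numpy as np

# Non-equidistant discrete levels of the variable z (hypothetical example).
Z_VALUES = np.array([0.1, 0.5, 2.0, 7.0])

def acquisition(x, z):
    """Stand-in for a BO acquisition function (toy objective, assumed)."""
    return -(x - 1.0) ** 2 - 0.5 * (z - 2.0) ** 2

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

def expected_acquisition(x, theta):
    """Exact expectation over z ~ Categorical(softmax(theta)).

    For small cardinality the expectation is a finite sum, so the
    reparameterized objective is smooth in both x and theta.
    """
    p = softmax(theta)
    return np.sum(p * acquisition(x, Z_VALUES))

def grads(x, theta, eps=1e-6):
    """Central finite-difference gradients; analytic gradients work too."""
    gx = (expected_acquisition(x + eps, theta)
          - expected_acquisition(x - eps, theta)) / (2 * eps)
    gt = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        gt[i] = (expected_acquisition(x, theta + d)
                 - expected_acquisition(x, theta - d)) / (2 * eps)
    return gx, gt

# Gradient ascent on the fully continuous reparameterized problem.
x, theta = 0.0, np.zeros(len(Z_VALUES))
for _ in range(500):
    gx, gt = grads(x, theta)
    x += 0.05 * gx
    theta += 0.5 * gt

# The distribution concentrates on the best discrete level.
best_z = Z_VALUES[np.argmax(softmax(theta))]
```

Here the expectation is computed exactly by summing over the (small) discrete support; for high-cardinality or multi-variable cases, Monte Carlo estimates of the same expectation keep the approach tractable.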
Keywords: Bayesian Optimization; Gaussian Process; Mixed-Variable Optimization
