Dresden 2026 – Scientific Programme
CPP: Chemical and Polymer Physics Division (Fachverband Chemische Physik und Polymerphysik)
CPP 9: French-German Session: Simulation Methods and Modeling of Soft Matter I
CPP 9.6: Talk
Monday, March 9, 2026, 16:30–16:45, ZEU/LICH
Fine-Tuning Unifies Foundational Machine-Learned Interatomic Potential Architectures at ab initio Accuracy — •Christian Dreßler, Jonas Hänseroth, Nawaz Qaisrani, and Aaron Flötotto — TU Ilmenau, Institute of Physics, Germany
Machine-learned force fields (MLFFs) have enabled molecular dynamics at ab initio quality with speedups of several orders of magnitude. Foundation-model MLFFs, trained on extremely large and diverse datasets, aim to provide broadly transferable force predictions, yet their accuracy remains system-dependent. We investigate five state-of-the-art foundation models (MACE, GRACE, ORB, MatterSim, SevenNet) and fine-tune them for seven chemically diverse systems.[1] Our results show that foundation models provide a useful baseline but still deviate significantly from ab initio molecular dynamics (AIMD). Fine-tuning consistently improves accuracy across all frameworks and yields predictions that closely match the AIMD reference data, while also reducing performance differences between models. The final accuracy is largely independent of the underlying architecture. These findings suggest that, after fine-tuning, model choice is no longer the main bottleneck; prioritizing inference speed and computational efficiency may therefore offer the greatest benefit for the practical use of ML-based interatomic potentials in materials modeling.
[1] Hänseroth, ..., Dreßler, "Fine-Tuning Unifies Foundational Machine-Learned Interatomic Potential Architectures at ab initio Accuracy", arXiv:2511.05337, https://doi.org/10.48550/arXiv.2511.05337
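
To make the workflow concrete: the comparison described above amounts to running the same system once with a pretrained foundation model and once with a fine-tuned checkpoint, then checking forces and observables against the AIMD reference. The sketch below is a minimal illustration using the mace-torch package's ASE interface; it is not the authors' actual setup. The water toy system, the "medium" checkpoint choice, and the model path finetuned_mace.model are placeholder assumptions, and the fine-tuned file itself would be produced beforehand with the package's own training tools on system-specific AIMD frames.

    # Minimal sketch (assumes mace-torch and ASE are installed; the
    # fine-tuned model file and the toy system are hypothetical).
    from ase import units
    from ase.build import molecule
    from ase.md.langevin import Langevin
    from mace.calculators import mace_mp, MACECalculator

    atoms = molecule("H2O")  # placeholder; stands in for a real target system

    # Baseline: pretrained MACE foundation model (the "medium" checkpoint).
    atoms.calc = mace_mp(model="medium", device="cpu")

    # Fine-tuned variant: load a checkpoint refined on AIMD frames of the
    # target system ("finetuned_mace.model" is a hypothetical path).
    # atoms.calc = MACECalculator(model_paths="finetuned_mace.model", device="cpu")

    # Short Langevin MD at 300 K; in practice one compares forces and
    # observables from both calculators against the AIMD trajectory.
    dyn = Langevin(atoms, timestep=0.5 * units.fs, temperature_K=300, friction=0.01)
    dyn.run(100)

Because both calculators expose the same ASE interface, swapping the foundation model for the fine-tuned one requires no other changes to the MD script, which is what makes the architecture-agnostic comparison in the abstract straightforward to set up.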
Keywords: machine-learned interatomic potentials; machine-learned force fields; ab initio molecular dynamics; foundation model; fine-tuning
