Dresden 2026 – Scientific Programme
DY: Dynamics and Statistical Physics Division
DY 58: Focus Session: Physics of AI – Part II (joint session SOE/DY)
DY 58.8: Talk
Friday, 13 March 2026, 12:00–12:15, GÖR/0226
From Kernels to Features: A Multi-Scale Adaptive Theory of Feature Learning — •Javed Lindner1,2, Noa Rubin5, Kirsten Fischer1,6, David Dahmen1, Inbar Seroussi4, Zohar Ringel5, Michael Krämer3, and Moritz Helias1,2 — 1Institute for Advanced Simulation (IAS-6), Computational and Systems Neuroscience, Jülich Research Centre, Jülich, Germany — 2Department of Physics, RWTH Aachen University, Aachen, Germany — 3Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University, Aachen, Germany — 4Department of Applied Mathematics, School of Mathematical Sciences, Tel-Aviv University, Tel-Aviv, Israel — 5The Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem, Israel — 6RWTH Aachen University, Aachen, Germany
Feature learning in neural networks is crucial for their expressive power and inductive biases, motivating various theoretical approaches. Some approaches describe network behavior after training through a change in kernel scale from initialization, resulting in generalization power comparable to that of a Gaussian process. Conversely, in other approaches training results in the adaptation of the kernel to the data, involving directional changes of the kernel. The relationship and respective strengths of these two views have so far remained unresolved. This work presents a theoretical framework of multi-scale adaptive feature learning that bridges these two views. Using methods from statistical mechanics, we derive analytical expressions for network output statistics which are valid across scaling regimes and in the continuum between them.
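To illustrate the distinction between the two views, a schematic contrast can be written as follows (an illustrative sketch only, not the expressions derived in the talk; the initial kernel K_0, scale factor \lambda, coefficients c_a, directions u_a, and noise level \sigma are assumed notation):

\[
\text{scale renormalization:}\quad K_0 \;\longrightarrow\; \lambda\, K_0,
\qquad
\text{kernel adaptation:}\quad K_0 \;\longrightarrow\; K_0 + \sum_a c_a\, u_a u_a^{\top},
\]

where in either case the predictive mean retains the Gaussian-process form \( \bar f(x_*) = k_*^{\top}\,(K + \sigma^2 I)^{-1} y \) evaluated with the respective kernel: the first view changes only the overall scale of the kernel, while the second rotates it toward data-dependent directions.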
Keywords: Statistical Physics; Feature Learning; Neural Networks; Bayesian posterior; Kernel
