Dresden 2026 – scientific programme
DY: Fachverband Dynamik und Statistische Physik
DY 58: Focus Session: Physics of AI – Part II (joint session SOE/DY)
DY 58.10: Talk
Friday, March 13, 2026, 12:30–12:45, GÖR/0226
Phase Transitions as Rank Transitions: Connecting Data Complexity and Cascades of Phase Transitions in analytically tractable Neural Network Models — •Björn Ladewig, Ibrahim Talha Ersoy, and Karoline Wiesner — Institute of Physics and Astronomy, University of Potsdam, Germany
Tuning the L2-regularization strength in neural networks can produce a cascade of (zero-temperature) phase transitions between regimes of increasing accuracy. This phenomenology was previously observed numerically and linked to a basin structure of the error landscape formed by the underlying data [1]. At the level of analytically tractable models, we (i) establish the existence of cascades of transitions in these models; (ii) give meaning to the transitions in terms of the ordered onset of "learned eigendirections" of the underlying data distribution; and (iii) link the phase transitions and corresponding accuracy regimes to saddle points of the error landscape.
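The cascade picture can be sketched numerically. The snippet below is an illustrative toy calculation, not the authors' model: it assumes a linear-network setting in which L2 regularization acts as a soft threshold on the singular values of the data, so a data mode with singular value s_i is "learned" only when s_i exceeds a threshold set by the regularization strength lam. The singular values and the per-mode gain formula are assumptions chosen to make the stepwise rank drops visible.

```python
import numpy as np

# Toy data spectrum with well-separated singular values (assumed values).
s = np.array([4.0, 2.0, 1.0, 0.5])

for lam in [0.25, 0.75, 1.5, 3.0, 5.0]:
    # Schematic soft-threshold picture: a mode's optimal gain is nonzero
    # only if its singular value exceeds the regularization strength lam.
    gain = np.maximum(1.0 - lam / s, 0.0)
    rank = int(np.count_nonzero(gain))
    print(f"lam={lam:4.2f}  active modes (rank) = {rank}")
```

Sweeping lam from small to large switches off the weakest eigendirection first, then the next, and so on, so the rank of the learned map drops in discrete steps: the "cascade of phase transitions as rank transitions" of the title, in caricature.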
[1] I. Talha Ersoy and K. Wiesner, "Exploring L2-phase transitions on error landscapes," ICML Workshop on High-dimensional Learning Dynamics, 2025. https://openreview.net/forum?id=AkQNtAw09u
Keywords: Deep Neural Network; Phase Transition; Regularization
