SOE 17: Focus Session: Physics of AI II (joint session SOE/DY)
Friday, March 13, 2026, 09:30–12:45, GÖR/0226
09:30  SOE 17.1  Invited Talk: What can we learn from neural quantum states? — Brandon Barton, Juan Carrasquilla, Anna Dawid, Antoine Georges, Megan Schuyler Moss, Alev Orfi, Christopher Roth, Dries Sels, Anirvan Sengupta, and •Agnes Valenti
10:00  SOE 17.2  The NN/QFT correspondence — •Ro Jefferson
10:15  SOE 17.3  Online Learning Dynamics and Neural Scaling Laws for a Perceptron Classification Problem — •Yoon Thelge, Marcel Kühn, and Bernd Rosenow
10:30  SOE 17.4  Power-Law Correlations in Language: Criticality vs. Hierarchical Generative Structure — •Marcel Kühn, Max Staats, and Bernd Rosenow
10:45  SOE 17.5  Dynamics of neural scaling laws in random feature regression — •Jakob Kramp, Javed Lindner, and Moritz Helias
11:00  15 min. break
11:15  SOE 17.6  Invited Talk: Creativity in generative AI — •Matthieu Wyart
11:45  SOE 17.7  Understanding Generative Models via Interactions — •Claudia Merger, Alexandre Rene, Kirsten Fischer, Peter Bouss, Sandra Nestler, David Dahmen, Carsten Honerkamp, Moritz Helias, and Sebastian Goldt
12:00  SOE 17.8  From Kernels to Features: A Multi-Scale Adaptive Theory of Feature Learning — •Javed Lindner, Noa Rubin, Kirsten Fischer, David Dahmen, Inbar Seroussi, Zohar Ringel, Michael Krämer, and Moritz Helias
12:15  SOE 17.9  Statistical physics of deep learning: Optimal learning of a multi-layer perceptron near interpolation — Jean Barbier, Francesco Camilli, Minh-Toan Nguyen, Mauro Pastore, and •Rudy Skerk
12:30  SOE 17.10  Phase Transitions as Rank Transitions: Connecting Data Complexity and Cascades of Phase Transitions in Analytically Tractable Neural Network Models — •Björn Ladewig, Ibrahim Talha Ersoy, and Karoline Wiesner