
SMuK 2023 – scientific programme


AKPIK: Arbeitskreis Physik, moderne Informationstechnologie und Künstliche Intelligenz

AKPIK 5: AI Topical Day – Neural Networks and Computational Complexity (joint session MP/AKPIK)

AKPIK 5.2: Invited Talk

Wednesday, March 22, 2023, 11:30–12:00, ZEU/0250

Deep neural networks and the renormalization group — •Ro Jefferson¹, Johanna Erdmenger², and Kevin Grosvenor³ — ¹Utrecht University — ²University of Würzburg — ³Leiden University

Despite the success of deep neural networks (DNNs) on an impressive range of tasks, they are generally treated as black boxes, with performance relying on heuristics and trial-and-error rather than any explanatory theoretical framework. Recently, however, techniques and ideas from physics have been applied to DNNs in the hope of distilling the underlying fundamental principles. In this talk, I will discuss some interesting parallels between DNNs and the renormalization group (RG). I will briefly review RG in the context of a simple lattice model, where subsequent RG steps are analogous to subsequent layers in a DNN, in that effective interactions arise after marginalizing hidden degrees of freedom/neurons. I will then quantify the intuitive idea that information is lost along the RG flow by computing the relative entropy in both the Ising model and a feedforward DNN. One finds qualitatively identical behaviour in both systems, in which the relative entropy increases monotonically to some asymptotic value. On the QFT side, this confirms the link between relative entropy and the c-theorem, while for machine learning, it may have implications for various information maximization methods, as well as for disentangling compactness and generalizability.
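The monotone growth of the relative entropy along an RG flow can be illustrated with a toy numerical sketch (this is not the speakers' actual computation, only an assumed minimal analogue): for a periodic 1D Ising chain, decimating every other spin is an exact RG step that maps the coupling J to J' with tanh(J') = tanh(J)². Tracking the KL divergence between the flowed Boltzmann distribution and the original "UV" distribution then shows the behaviour described in the abstract, starting at zero and increasing monotonically toward an asymptotic value as the coupling flows to the trivial fixed point:

```python
import itertools
import math

def boltzmann(J, n):
    """Exact Boltzmann distribution of a periodic 1D Ising chain
    with n spins and nearest-neighbour coupling J (k_B T = 1)."""
    weights = []
    for s in itertools.product((-1, 1), repeat=n):
        energy = -J * sum(s[i] * s[(i + 1) % n] for i in range(n))
        weights.append(math.exp(-energy))
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

def rel_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def decimate(J):
    """One exact decimation step for the 1D Ising chain:
    marginalizing every other spin gives tanh(J') = tanh(J)^2."""
    return math.atanh(math.tanh(J) ** 2)

n, J0 = 8, 1.0
q = boltzmann(J0, n)   # the "UV" reference distribution
J, flow = J0, []
for k in range(8):
    flow.append(rel_entropy(boltzmann(J, n), q))
    J = decimate(J)

# D(p_k || p_0) starts at 0 and grows monotonically toward the
# asymptotic value D(p_{J=0} || p_{J0}) as J flows to zero.
print([round(d, 4) for d in flow])
```

The monotonicity here follows from convexity of the relative entropy in the coupling along this one-parameter exponential family; the asymptotic value is the divergence of the infinite-temperature (uniform) distribution from the UV theory.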
