
Dresden 2026 – scientific programme


DY: Fachverband Dynamik und Statistische Physik

DY 33: Statistical Physics of Biological Systems I (joint session DY/BP)

DY 33.8: Talk

Wednesday, March 11, 2026, 11:30–11:45, ZEU/0114

Population sparseness in recurrent spiking neural networks — •Jakob Stubenrauch¹,², Naomi Auer³, Richard Kempter²,³,⁴, and Benjamin Lindner¹,² — ¹Physics Department, HU Berlin — ²Bernstein Center for Computational Neuroscience Berlin — ³Institute for Theoretical Biology, HU Berlin — ⁴Einstein Center for Neurosciences Berlin

It has long been known that, in association tasks for neural networks, the fraction of active neurons in the patterns to be associated plays an important role. Specifically, the number of patterns that can be simultaneously remembered grows as the information content per pattern is decreased. In binary networks, this content can be constrained via the population sparseness (one minus the fraction of active neurons). For neurons with graded activity, population sparseness can be quantified by the Treves-Rolls measure or by the Gini coefficient. Here, we present results on the spontaneous and evoked population sparseness in different variants of recurrent networks of integrate-and-fire neurons. We find that the type of competition between neurons plays an important role, and we argue that neurons in fully disordered networks can, in a mean-field limit, compete only through a low-dimensional effective inhibition hub. We showcase the relevance of our findings for association tasks in spiking neural networks.
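The two sparseness measures named in the abstract have standard textbook definitions; a minimal sketch of both (our own illustration, not the authors' code — function names are hypothetical) could look like this, where the Treves-Rolls activity ratio is a = ⟨r⟩² / ⟨r²⟩ and sparseness is reported as 1 − a:

```python
def treves_rolls_sparseness(rates):
    """Treves-Rolls population sparseness 1 - a, with a = <r>^2 / <r^2>.
    a = 1 for uniform activity (sparseness 0); a = 1/N if exactly one
    of N neurons is active (sparseness 1 - 1/N)."""
    n = len(rates)
    mean = sum(rates) / n
    mean_sq = sum(r * r for r in rates) / n
    return 1.0 - mean * mean / mean_sq

def gini(rates):
    """Gini coefficient of a nonnegative activity vector:
    0 for uniform activity, approaching 1 when a single neuron
    carries all the activity."""
    r = sorted(rates)
    n = len(r)
    total = sum(r)
    # rank-weighted sum formulation of the Gini coefficient
    weighted = sum((i + 1) * x for i, x in enumerate(r))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

# Example: one active neuron out of four is maximally sparse
print(treves_rolls_sparseness([1.0, 0.0, 0.0, 0.0]))  # → 0.75
print(gini([1.0, 0.0, 0.0, 0.0]))                     # → 0.75
```

Both measures reduce to the "one minus fraction of active neurons" notion for binary patterns but remain well defined for graded firing rates.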

Keywords: Spiking neural networks; Associative memory; Mean field theory; Sparseness
