
Dresden 2003 – scientific programme


DY: Dynamik und Statistische Physik

DY 46: Poster

DY 46.57: Poster

Thursday, March 27, 2003, 15:30–18:00, P1

A stochastic Hebb-like learning rule for neural networks — •Frank Emmert-Streib — Universität Bremen, Institut für Theoretische Physik, D-28334 Bremen

The classical learning rule for neural networks was proposed in 1949 by D. Hebb. He postulated that synaptic strength increases if the pre- and postsynaptic neurons fire together within a small time window. This postulate was later confirmed experimentally [Bliss and Lomo, 1973; Markram et al., 1997]. More precisely, the form of synaptic plasticity described by the classical Hebbian learning rule is called long-term potentiation (LTP), which is only one special case of synaptic plasticity: depending on the time scale and the conditions that induce the plasticity, many different forms coexist.
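The classical Hebbian rule described above can be sketched as a simple outer-product weight update; the learning rate `eta` and the binary activity vectors below are illustrative assumptions, not details from the abstract:

```python
import numpy as np

def hebbian_update(w, pre, post, eta=0.1):
    """Classical Hebbian rule: strengthen w[i, j] whenever presynaptic
    unit j and postsynaptic unit i are active within the same window."""
    return w + eta * np.outer(post, pre)

# Two pre- and two postsynaptic units; only the (0, 0) pair fires together.
w = np.zeros((2, 2))
pre = np.array([1.0, 0.0])
post = np.array([1.0, 0.0])
w = hebbian_update(w, pre, post)
# Only the synapse between the two co-active units is strengthened.
```

Note that this rule can only strengthen synapses; capturing LTD, as the abstract emphasizes, requires extending it.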
Here we present a stochastic Hebb-like learning rule for neural networks that explains heterosynaptic long-term depression (LTD) qualitatively. The model builds on the theoretical work of [Chialvo and Bak, 1999; Klemm, Bornholdt and Schuster, 2000] and combines their properties using the stochastic optimization method of [Boettcher, 2001]. We demonstrate the rule on a multilayer neural network.
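As a rough illustration of the "learning by mistakes" mechanism in the Chialvo-Bak model that this work builds on (a winner-take-all network in which only the synapse that produced a wrong answer is depressed, an LTD-like change), one might sketch the following; all function names, parameters, and the toy task are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mistake_step(w, x, target, delta=0.1):
    """One step of a Chialvo-Bak-style rule ('learning by mistakes'):
    the winning input unit drives the output with the strongest synapse;
    on a wrong answer only that active synapse is depressed (LTD-like).
    Names and parameters here are illustrative assumptions."""
    pre = int(np.argmax(x))           # winner-take-all on the input layer
    post = int(np.argmax(w[:, pre]))  # output selected by strongest synapse
    if post != target:
        w[post, pre] -= delta         # depress only the synapse that fired
    return w, post

# Toy task: input unit k should activate output unit k.
w = rng.random((3, 3))
for _ in range(500):
    k = int(rng.integers(3))
    w, _ = mistake_step(w, np.eye(3)[k], target=k)
# After training, each input selects its correct output,
# purely through depression of wrong answers.
```

Because correct synapses are never touched and wrong ones only shrink, the network settles into the desired mapping without any explicit potentiation step.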
