
KY: Cybernetics (Kybernetik)

KY 2: Dynamical Systems | Neural Networks

KY 2.2: Talk

Tuesday, March 16, 1999, 09:45–10:30, ZO 3

The law of maximum entropy in natural languages as a consequence of the structure of the neural language network — •Wolfgang Hilberg — TUD, Fachgebiet Digitaltechnik, Fachbereich 18, Elektrotechnik und Informationstechnik, Merckstr. 25, D-64283 Darmstadt

Compared with Zipf diagrams, the newly developed association matrices are a superior tool for analyzing the structure of natural-language texts and for recognizing the underlying neural network structures of the cortex. Even for texts in different languages, the measured matrix populations are very similar and describe networks of high connectivity with a unique and simple mathematical structure, verifying in an illustrative way the well-known but vehemently contested property of maximum entropy once published by Mandelbrot. Finally, the analysis can be confirmed by calculating, for the first time, the numerical value of this maximum entropy and comparing it with measured values. The good agreement of these calculated and measured values, together with the agreement of other measured and calculated curves in which word frequency is plotted against word length, provides further confirmation. In the future, the most interesting point will surely be that all considerations made at the word level can also be transferred to other levels of abstraction (thought levels), both in natural-language text and in neural networks. A recently proposed model system also answers the emerging question of the cases in which natural-language texts have the contrary, i.e. minimum and maximum, entropies (Shannon, Mandelbrot).
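The abstract's core quantities, a word-pair association matrix and an empirical entropy, can be illustrated with a toy calculation. This is a minimal sketch, not the author's method: the helper names `association_matrix` and `shannon_entropy` are hypothetical, and counting adjacent word pairs is only a crude stand-in for the association matrices described above.

```python
from collections import Counter
import math

def association_matrix(text):
    """Count adjacent word pairs: a toy sparse association matrix,
    where entry (a, b) is how often word b follows word a."""
    words = text.lower().split()
    return Counter(zip(words, words[1:]))

def shannon_entropy(counts):
    """Shannon entropy (in bits) of the empirical distribution
    given by a Counter of occurrence counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

text = "the cat sat on the mat and the cat ran"
pairs = association_matrix(text)
words = Counter(text.lower().split())

print(f"pair-level entropy: {shannon_entropy(pairs):.3f} bits")
print(f"word-level entropy: {shannon_entropy(words):.3f} bits")
```

Comparing measured entropies like these against a calculated maximum (e.g. the uniform-distribution bound log2 of the vocabulary size) is the kind of numerical check the abstract refers to, here shown only in schematic form.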

DPG-Physik > DPG-Verhandlungen > 1999 > Heidelberg