A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks
Other publication. Year: 2008


Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different time scales for neuronal activity and learning dynamics. Previous numerical studies have reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on the evolution of the neural network. Furthermore, we show that the sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
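As a rough illustration of the setting described above (a sketch, not the authors' exact model), the following NumPy code simulates a discrete-time random recurrent network x_{t+1} = tanh(W x_t), applies a Hebbian update with passive forgetting of the generic form W ← (1−λ)W + (α/N) x_{t+1} x_tᵀ, and estimates the largest Lyapunov exponent from the product of Jacobians along the trajectory. The gain g, the rates α and λ, and the precise form of the rule are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

def largest_lyapunov(N=100, T=2000, g=3.0, alpha=0.01, lam=0.01, seed=0):
    """Simulate x_{t+1} = tanh(W x_t) with an illustrative Hebbian rule
    including passive forgetting:
        W <- (1 - lam) * W + (alpha / N) * outer(x_{t+1}, x_t)
    and return the largest Lyapunov exponent, estimated by propagating a
    tangent vector through the Jacobians J_t = diag(1 - x_{t+1}^2) W.
    With alpha = lam = 0 the weights are frozen (no learning)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, g / np.sqrt(N), (N, N))  # random synaptic matrix
    x = rng.uniform(-1.0, 1.0, N)                # random initial state
    v = rng.normal(size=N)
    v /= np.linalg.norm(v)                       # unit tangent vector
    log_growth = 0.0
    for _ in range(T):
        x_new = np.tanh(W @ x)
        # Jacobian-vector product for this step, then renormalize
        v = (1.0 - x_new**2) * (W @ v)
        nv = np.linalg.norm(v)
        log_growth += np.log(nv)
        v /= nv
        # Hebbian learning with passive forgetting (slow time scale)
        W = (1.0 - lam) * W + (alpha / N) * np.outer(x_new, x)
        x = x_new
    return log_growth / T
```

In this toy setting, the frozen network at gain g = 3 sits in the chaotic regime (positive largest Lyapunov exponent), and turning on the Hebbian rule with passive forgetting contracts the weights and lowers the exponent, qualitatively matching the chaos-to-steady-state route the abstract describes.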
Main file
NECO-05-007-530-Source.pdf (355.58 KB)
Origin: Files produced by the author(s)

Dates and versions

inria-00149181 , version 1 (24-05-2007)
inria-00149181 , version 2 (07-04-2008)

Identifiers

  • HAL Id : inria-00149181 , version 2
  • ARXIV : 0705.3690

Cite

Benoit Siri, Hugues Berry, Bruno Cessac, Bruno Delord, Mathias Quoy. A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks. 2008. ⟨inria-00149181v2⟩
526 views
347 downloads

