
A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

Benoit Siri (1), Hugues Berry (1, *), Bruno Cessac (2, 3, 4), Bruno Delord (5), Mathias Quoy (6)
* Corresponding author
1 ALCHEMY - Architectures, Languages and Compilers to Harness the End of Moore Years
LRI - Laboratoire de Recherche en Informatique, UP11 - Université Paris-Sud - Paris 11, Inria Saclay - Ile de France, CNRS - Centre National de la Recherche Scientifique : UMR8623
3 ODYSSEE - Computer and biological vision
DI-ENS - Département d'informatique de l'École normale supérieure, CRISAM - Inria Sophia Antipolis - Méditerranée, ENS Paris - École normale supérieure - Paris, Inria Paris-Rocquencourt, ENPC - École des Ponts ParisTech
Abstract: We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and different time scales for the neuronal activity and the learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between the neuronal dynamics and the structure of the synaptic graph, can be analyzed using Jacobian matrices, which provide both a structural and a dynamical point of view on the evolution of the neural network. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
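The record reproduces only the abstract, not the model equations, so the sketch below is an illustration rather than the paper's exact formulation. It implements one common version of this model family, as an assumption: a discrete-time rate network x(t+1) = tanh(g W x(t)) with Gaussian random weights scaled by 1/sqrt(N), a Hebbian update on time-averaged rates with a passive-forgetting factor lam applied once every tau neuronal steps (the two time scales mentioned in the abstract), and a power-iteration estimate of the largest Lyapunov exponent from products of the map's Jacobians. All parameter values and the specific Hebbian form are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of neurons
g = 3.0        # gain; large g puts the untrained network in a chaotic regime
lam = 0.99     # passive-forgetting factor (lam = 1 would mean no forgetting)
alpha = 0.01   # Hebbian learning rate
tau = 100      # fast neuronal steps per slow learning step (time-scale separation)
epochs = 200   # number of learning steps

# Random recurrent weights, scaled by 1/sqrt(N) as in mean-field treatments.
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)


def step(x, W):
    """One discrete-time update of the firing rates."""
    return np.tanh(g * W @ x)


def largest_lyapunov(x, W, n_steps=1000):
    """Estimate the largest Lyapunov exponent by pushing a tangent vector
    through the Jacobians of the map along the trajectory."""
    v = rng.normal(size=x.size)
    v /= np.linalg.norm(v)
    acc = 0.0
    for _ in range(n_steps):
        x = step(x, W)
        # Jacobian of x -> tanh(g W x), evaluated along the orbit:
        # diag(g * (1 - x_new**2)) @ W
        J = (g * (1.0 - x**2))[:, None] * W
        v = J @ v
        norm = np.linalg.norm(v)
        acc += np.log(norm)
        v /= norm
    return acc / n_steps


for _ in range(epochs):
    # Fast time scale: run the neuronal dynamics for tau steps and average the rates.
    mean_rate = np.zeros(N)
    for _ in range(tau):
        x = step(x, W)
        mean_rate += x
    mean_rate /= tau
    # Slow time scale: generic Hebbian update with passive forgetting.
    W = lam * W + (alpha / N) * np.outer(mean_rate, mean_rate)

print("largest Lyapunov exponent ~", largest_lyapunov(x, W))
```

Under these assumptions, tracking the Lyapunov estimate across learning epochs is one way to observe the qualitative effect the abstract describes: the exponent decreasing from positive (chaos) toward zero and below (steady state) as the Hebbian updates reshape the weight matrix.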

Cited literature: 22 references

https://hal.inria.fr/inria-00149181
Contributor: Hugues Berry
Submitted on: Monday, April 7, 2008 - 10:51:24 AM
Last modification on: Tuesday, September 22, 2020 - 3:52:32 AM
Long-term archiving on: Tuesday, September 21, 2010 - 4:15:00 PM

Files

NECO-05-007-530-Source.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: inria-00149181, version 2
  • ARXIV: 0705.3690

Citation

Benoit Siri, Hugues Berry, Bruno Cessac, Bruno Delord, Mathias Quoy. A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks. 2008. ⟨inria-00149181v2⟩

Metrics

  • Record views: 1032
  • File downloads: 608