A Mathematical Analysis of Memory Lifetime in a Simple Network Model of Memory
Abstract
We study the learning of an external signal by a neural network and the time taken to forget it when the network is subjected to noise. The presentation of an external stimulus to a recurrent network of binary neurons may change the states of the synapses. Repeated presentations of a single signal lead to its learning. Afterwards, during the forgetting period, the presentation of other signals (noise) may also modify the synaptic weights. We construct an estimator of the initial signal from the synaptic currents and thereby define a probability of error. In our model, these synaptic currents evolve as Markov chains. We study the dynamics of these Markov chains and obtain a lower bound on the number of external stimuli that the network can receive before the initial signal is considered forgotten (i.e., the probability of error exceeds a given threshold). Our results hold for finite-size networks as well as in the large-size limit, and they rely on a finite-time analysis rather than large-time asymptotics. We conclude with numerical illustrations of our results.
Origin: Files produced by the author(s)
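The learning/forgetting dynamics described in the abstract can be illustrated with a toy simulation. The sketch below is NOT the authors' model: it assumes, for illustration only, binary synaptic weights that each adopt the presented stimulus bit with a hypothetical probability `p` (a simple Markov transition), a naive estimator that reads the signal directly off the weights, and a forgetting criterion where the error probability crosses a threshold. All parameter names (`p`, `n_learn`, `threshold`) are invented for this sketch.

```python
import random

def present(weights, stimulus, p, rng):
    # Each synapse adopts the corresponding stimulus bit with
    # probability p, otherwise keeps its current state.
    # (Hypothetical transition rule, not the paper's.)
    return [s if rng.random() < p else w for w, s in zip(weights, stimulus)]

def error_probability(weights, signal):
    # Fraction of synapses disagreeing with the signal: the error of
    # a naive estimator that reads the signal off the weights.
    return sum(w != s for w, s in zip(weights, signal)) / len(signal)

def simulate(n=2000, p=0.1, n_learn=100, threshold=0.25,
             max_noise=10_000, seed=0):
    rng = random.Random(seed)
    signal = [rng.randint(0, 1) for _ in range(n)]
    weights = [rng.randint(0, 1) for _ in range(n)]
    # Learning phase: repeated presentations of the same signal.
    for _ in range(n_learn):
        weights = present(weights, signal, p, rng)
    err_after_learning = error_probability(weights, signal)
    # Forgetting phase: random noise stimuli are presented until the
    # estimator's error probability exceeds the threshold.
    for t in range(1, max_noise + 1):
        noise = [rng.randint(0, 1) for _ in range(n)]
        weights = present(weights, noise, p, rng)
        if error_probability(weights, signal) > threshold:
            return err_after_learning, t
    return err_after_learning, max_noise

if __name__ == "__main__":
    err0, steps = simulate()
    print(f"error after learning: {err0:.3f}; "
          f"forgotten after {steps} noise stimuli")
```

Under this toy rule the per-synapse error relaxes toward 1/2 geometrically during the noise phase, so the number of noise presentations before forgetting grows only logarithmically in the inverse threshold gap; the paper's finite-time Markov chain analysis is what replaces this heuristic with rigorous lower bounds.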