Inria - Institut national de recherche en sciences et technologies du numérique
Preprint / Working Paper. Year: 2014

Large Deviations of an Ergodic Synchronous Neural Network with Learning

Abstract

In this work we establish a Large Deviation Principle (LDP) for a model of neurons interacting on the lattice Z^d. The neurons are subject to correlated external noise, modelled as an infinite-dimensional stochastic integral. The probability law governing the noise is strictly stationary, which allows us to obtain an LDP for the probability laws Pi^n governing the ergodic empirical measure mu^n generated by the neurons in a cube of side length (2n+1), as n tends to infinity. We use this LDP to derive an LDP for the neural network model. The connection weights between the neurons evolve according to a learning rule (neuronal plasticity), and the results are adaptable to a wide variety of specific neural network models. The LDP is of great use in the mathematical modelling of neural networks because it quantifies the likelihood of the system deviating from its limit, and identifies the direction in which such deviations are most likely to occur. The work is also of interest because nontrivial correlations between the neurons persist even in the asymptotic limit, so that the result generalises traditional mean-field models.
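For orientation, the objects named in the abstract can be sketched in standard notation for process-level large deviations of stationary lattice fields; this is a hedged reconstruction, and the paper's exact definitions may differ. Writing V_n = {-n, ..., n}^d for the cube of side length (2n+1), and S^k for the shift by k on configurations omega, the ergodic empirical measure and the LDP statement typically take the form

```latex
\hat{\mu}^n(\omega) \;=\; \frac{1}{(2n+1)^d} \sum_{k \in V_n} \delta_{S^k \omega},
\qquad V_n = \{-n, \dots, n\}^d,
```

and the LDP for the laws $\Pi^n$ asserts, informally, that for suitable sets $A$ of stationary measures,

```latex
\Pi^n\!\left(\hat{\mu}^n \in A\right) \;\approx\; \exp\!\Bigl(-(2n+1)^d \inf_{\mu \in A} I(\mu)\Bigr),
```

where $I$ is a rate function (in the stationary ergodic setting, rate functions of this type are often given by a specific relative entropy). The infimum over $A$ is what makes the LDP useful for identifying the most likely direction of deviation: the minimising $\mu$ describes the dominant way the system departs from its limit.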
Main file: infiniteldims.pdf (472.69 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01100020 , version 1 (05-01-2015)

Identifiers

Cite

Olivier Faugeras, James Maclaurin. Large Deviations of an Ergodic Synchronous Neural Network with Learning. 2014. ⟨hal-01100020⟩
222 Views
196 Downloads
