Poster presentation

Beyond dynamical mean-field theory of neural networks

Massimiliano Muratori, Bruno Cessac (bruno.cessac@inria.fr)

NeuroMathComp team (INRIA, UNSA LJAD), Sophia Antipolis, France

BMC Neuroscience 2013, 14(Suppl 1):P60. From: Abstracts from the Twenty Second Annual Computational Neuroscience Meeting: CNS*2013, Paris, France, 13-18 July 2013 (http://www.cnsorg.org/cns-2013-paris). http://www.biomedcentral.com/1471-2202/14/S1/P60 doi:10.1186/1471-2202-14-S1-P60
Published: 8 July 2013. © 2013 Muratori and Cessac; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We consider a set of N firing-rate neurons with discrete-time dynamics and a leak term γ. The nonlinearity of the sigmoid is controlled by a gain parameter g, and each neuron has a firing threshold θ drawn from a Gaussian distribution (thresholds are uncorrelated). The network is fully connected, with correlated Gaussian random synaptic weights of mean zero and covariance matrix C/N. When the synaptic weights are uncorrelated, the dynamic mean-field theory developed in [1-3] allows us to draw the bifurcation diagram of the model in the thermodynamic limit (N tending to infinity): in particular, there is a sharp transition from a fixed point to chaos, characterized by the maximal Lyapunov exponent, which is known analytically in the thermodynamic limit. The bifurcation diagram is drawn in Figure 1A. However, mean-field theory is exact only in the thermodynamic limit and only when the synaptic weights are uncorrelated. What deviations from mean-field theory are observed when one departs from these hypotheses? We first studied the finite-size dynamics. For finite N, the maximal Lyapunov exponent has a plateau at 0, corresponding to a transition to chaos by quasi-periodicity in which the dynamics is at the edge of chaos (Figure 1B). This plateau disappears in the thermodynamic limit. Thus, mean-field theory neglects an important finite-size effect, since neuronal dynamics at the edge of chaos has strong implications for the learning performance of the network [4]. We also studied the effect of a weak correlation (of amplitude ε) on the dynamics. Even when ε is small, one detects a significant deviation in the maximal Lyapunov exponent (Figure 1C).
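To make the setup concrete, the following minimal sketch estimates the maximal Lyapunov exponent of one finite-N realization by iterating the tangent dynamics alongside the network state. It assumes a standard discrete-time rate model, x(t+1) = γ x(t) + J f(x(t)) − θ with f(u) = (1 + tanh(g u))/2; the precise sigmoid, the parameter values, and the shared-component correlation structure used for ε > 0 are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def max_lyapunov(N=500, g=3.0, gamma=0.0, theta_std=0.1, eps=0.0,
                 T=5000, T_trans=1000, seed=0):
    """Estimate the maximal Lyapunov exponent of the rate network
    x(t+1) = gamma*x(t) + J f(x(t)) - theta, f(u) = (1 + tanh(g*u))/2."""
    rng = np.random.default_rng(seed)

    # Gaussian weights with variance 1/N; for eps > 0, mix in a single
    # shared Gaussian component so distinct entries have covariance eps/N
    # (one illustrative correlation structure among many possible C's).
    J0 = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    shared = rng.normal(0.0, 1.0 / np.sqrt(N))
    J = np.sqrt(1.0 - eps) * J0 + np.sqrt(eps) * shared

    # Uncorrelated Gaussian firing thresholds
    theta = rng.normal(0.0, theta_std, size=N)

    f = lambda u: 0.5 * (1.0 + np.tanh(g * u))
    df = lambda u: 0.5 * g / np.cosh(g * u) ** 2  # derivative of f

    x = rng.normal(0.0, 1.0, size=N)   # network state
    v = rng.normal(0.0, 1.0, size=N)   # tangent vector
    v /= np.linalg.norm(v)

    log_sum = 0.0
    for t in range(T):
        # Tangent dynamics: Jacobian D(t) = gamma*I + J diag(f'(x(t)))
        v = gamma * v + J @ (df(x) * v)
        x = gamma * x + J @ f(x) - theta
        norm = np.linalg.norm(v)
        v /= norm
        if t >= T_trans:            # discard transient before averaging
            log_sum += np.log(norm)
    return log_sum / (T - T_trans)

# Example: compare uncorrelated and weakly correlated weights
print(max_lyapunov(eps=0.0), max_lyapunov(eps=0.01))
```

A positive estimate indicates chaos, a negative one a stable fixed point; averaging over several weight realizations and sweeping g traces the finite-N analogue of the mean-field transition.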

Figure 1. (A) Bifurcation map: 1, one stable fixed point; 2, two stable fixed points; 3, one fixed point and one strange attractor; 4, one strange attractor. (B) Finite-N and mean-field maximal Lyapunov exponent (θ = 0.1, γ = 0). (C) Finite-N maximal Lyapunov exponent with weak correlation (ε = 0.01) and mean-field maximal Lyapunov exponent without correlation (ε = 0).

Acknowledgements

This work was supported by INRIA, ERC-NERVI grant number 227747, KEOPS ANR-CONICYT, European Union project FP7-269921 (BrainScaleS), Renvision grant agreement No. 600847, and Mathemacs FP7-ICT-2011.9.7.

References

1. Cessac B, Doyon B, Quoy M, Samuelides M: Mean-field equations, bifurcation map and route to chaos in discrete time neural networks. Physica D 1994, 74:24-44. doi:10.1016/0167-2789(94)90024-8
2. Cessac B: Increase in Complexity in Random Neural Networks. J Phys I France 1995, 5:409-432. doi:10.1051/jp1:1995135
3. Moynot O, Samuelides M: Large deviations and mean-field theory for asymmetric random recurrent neural networks. Probab Theory Relat Fields 2002, 123:41-75. doi:10.1007/s004400100182
4. Legenstein R, Maass W: Edge of Chaos and Prediction of Computational Performance for Neural Circuit Models. Neural Networks 2007, 20:323-334. doi:10.1016/j.neunet.2007.04.017