Structure and dynamics of random recurrent neural networks
Abstract
In contrast to Hopfield-like networks, random recurrent neural networks (RRNN), whose couplings are random, exhibit complex dynamics (limit cycles, chaos). Information can nevertheless be stored in these networks through Hebbian learning. Learning eventually ``destroys'' the dynamics and leads to a fixed-point attractor. We investigate here the structural changes induced in the networks by learning, and show a ``small-world'' effect.
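The regime described above can be illustrated with a minimal numerical sketch, assuming standard discrete-time rate dynamics x(t+1) = tanh(W x(t)) with i.i.d. Gaussian couplings of variance g²/N; the gain g, the network size N, and the Hebbian learning strength c below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100   # number of neurons (illustrative choice)
g = 1.5   # coupling gain; g > 1 favors complex dynamics in random networks

# Random couplings: i.i.d. Gaussian with variance g^2 / N (standard RRNN scaling)
W = rng.normal(0.0, g / np.sqrt(N), (N, N))

def step(x, W):
    """One step of the discrete-time rate dynamics x(t+1) = tanh(W x(t))."""
    return np.tanh(W @ x)

# Free dynamics of the purely random network (generically non-trivial attractor)
x = rng.normal(0.0, 1.0, N)
for _ in range(300):
    x = step(x, W)

# Hebbian storage of one random binary pattern: a rank-one update.
# The learning strength `c` is an assumed parameter.
xi = rng.choice([-1.0, 1.0], N)
c = 5.0
W_learned = W + (c / N) * np.outer(xi, xi)

# After learning, the dynamics are pulled toward the pattern direction
y = x.copy()
for _ in range(300):
    y = step(y, W_learned)
overlap = abs(xi @ y) / N   # overlap between the state and the stored pattern
```

The rank-one Hebbian term adds a contribution of size c along the pattern direction (xiᵀ W_learned xi / N exceeds xiᵀ W xi / N by exactly c); when this term dominates the random part, the complex attractor tends to be replaced by a fixed point correlated with the stored pattern, which is the ``destruction'' of the dynamics mentioned in the abstract.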