Recursive Principal Components Analysis

Thomas Voegtlin
CORTEX - Neuromimetic Intelligence, INRIA Lorraine, LORIA (Laboratoire Lorrain de Recherche en Informatique et ses Applications)
Abstract: A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network can be retrieved explicitly, in the reverse order of presentation, providing a straightforward neural implementation of a logical stack.
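The abstract's starting point is Oja's constrained Hebbian rule, which extracts the leading principal component of a data stream. Below is a minimal sketch of that rule on synthetic data, not the paper's full recurrent network; in the recursive variant described above, the input would additionally include the network's own previous output as temporal context.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data with one dominant direction (std 3 vs. std 0.5).
X = rng.normal(size=(5000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# One linear unit trained with Oja's rule. In Recursive PCA, x would be
# the concatenation of the current observation and the previous outputs.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01
for x in X:
    y = w @ x                   # Hebbian activation
    w += eta * y * (x - y * w)  # Hebbian term with Oja's decay constraint

# w converges (up to sign) to the leading eigenvector of the covariance
# of X, while the constraint keeps its norm close to 1.
```

The decay term `- eta * y * y * w` is what distinguishes Oja's rule from plain Hebbian learning: it normalizes the weight vector implicitly, so no explicit renormalization step is needed.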
Document type :
Journal articles
Submitted on : Thursday, September 15, 2005 - 3:11:14 PM


  • HAL Id : inria-00000222, version 1



Thomas Voegtlin. Recursive Principal Components Analysis. Neural Networks, 2005, 18 (8), pp. 1051-1063. ⟨inria-00000222⟩


