Recursive Principal Components Analysis

Thomas Voegtlin 1
1 CORTEX - Neuromimetic Intelligence, INRIA Lorraine, LORIA (Laboratoire Lorrain de Recherche en Informatique et ses Applications)
Abstract: A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, providing a straightforward neural implementation of a logical stack.
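To make the abstract's mechanism concrete, here is a minimal sketch in Python/NumPy. It is an illustrative reconstruction, not the paper's code: it assumes Oja's subspace rule is applied to the concatenation of the current input x(t) and the previous output y(t-1), and all names and parameters (train_recursive_pca, pop_sequence, eta, n_epochs) are hypothetical.

```python
import numpy as np

def train_recursive_pca(X, n_components, eta=0.005, n_epochs=20, seed=0):
    """Sketch of recursive PCA: Oja's subspace rule on the augmented
    input [x(t); y(t-1)], so the learned components capture temporal
    context, not just the instantaneous input. (Hypothetical code.)"""
    rng = np.random.default_rng(seed)
    d, m = X.shape[1], n_components
    W = rng.normal(scale=0.1, size=(m, d + m))   # weights over [x; y_prev]
    for _ in range(n_epochs):
        y = np.zeros(m)                          # empty context at start
        for x in X:
            z = np.concatenate([x, y])           # augmented input
            y = W @ z                            # new context code
            # Oja's constrained Hebbian update: a Hebbian term plus a
            # decay that keeps the rows of W approximately orthonormal.
            W += eta * (np.outer(y, z) - np.outer(y, y) @ W)
    return W

def pop_sequence(W, y, d, n_steps):
    """Read stored inputs back in reverse order of presentation.
    Because W is near-orthonormal after training, W.T @ y approximately
    reconstructs [x(t); y(t-1)]: the first d entries 'pop' the most
    recent input and the tail becomes the new state, i.e. the stack
    behaviour described in the abstract."""
    outputs = []
    for _ in range(n_steps):
        z = W.T @ y
        outputs.append(z[:d])                    # most recent input first
        y = z[d:]                                # step the context back
    return outputs
```

A short usage example under the same assumptions: push a few inputs through the trained network, then pop them back in reverse.

```python
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))                    # toy training sequence
W = train_recursive_pca(X, n_components=12)
y = np.zeros(12)
for x in X[:5]:                                  # store five inputs
    y = W @ np.concatenate([x, y])
recovered = pop_sequence(W, y, d=4, n_steps=5)   # approx. X[4], ..., X[0]
```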
Document type: Journal articles

https://hal.inria.fr/inria-00000222
Contributor: Thomas Voegtlin
Submitted on: Thursday, September 15, 2005 - 3:11:14 PM
Last modification on: Thursday, January 11, 2018 - 6:19:48 AM
Long-term archiving on: Thursday, April 1, 2010 - 10:25:57 PM

Identifiers

  • HAL Id: inria-00000222, version 1

Citation

Thomas Voegtlin. Recursive Principal Components Analysis. Neural Networks, Elsevier, 2005, 18 (8), pp. 1051-1063. ⟨inria-00000222⟩
