Journal article, Neural Networks, 2005

Recursive Principal Components Analysis

Thomas Voegtlin
  • Role: Author

Abstract

A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straightforward neural implementation of a logical stack.
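The abstract is terse about the mechanics, so the following is a minimal NumPy sketch of one plausible reading, not the author's reference implementation: Oja's subspace rule is applied to the concatenation of the current input with a scaled copy of the network's previous output, so the learned components span a recursively defined space. The function name, the context gain alpha, the learning rate, and all other parameter values are illustrative assumptions, not values taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def recursive_pca(sequence, n_components=3, alpha=0.5, eta=0.01, n_epochs=50):
    """Sketch: Oja's subspace rule applied to z_t = [x_t ; alpha * y_{t-1}]."""
    input_dim = sequence.shape[1]
    z_dim = input_dim + n_components
    W = rng.normal(scale=0.1, size=(n_components, z_dim))
    for _ in range(n_epochs):
        y = np.zeros(n_components)              # temporal context, reset each epoch
        for x in sequence:
            z = np.concatenate([x, alpha * y])  # current input + scaled previous output
            y = W @ z                           # linear recurrent response
            # Oja's subspace rule: Hebbian term minus a decay term that keeps
            # the rows of W approximately orthonormal.
            W += eta * (np.outer(y, z) - np.outer(y, y) @ W)
    return W

# Toy usage: a slow sine wave as a one-dimensional input sequence.
seq = np.sin(np.linspace(0, 8 * np.pi, 200)).reshape(-1, 1)
W = recursive_pca(seq)
print(W.shape)  # (3, 4): components defined over the joint [x_t, y_{t-1}] space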
Main file: article.pdf (1.41 MB)

Dates and versions

inria-00000222, version 1 (15-09-2005)

Identifiers

  • HAL Id: inria-00000222, version 1

Cite

Thomas Voegtlin. Recursive Principal Components Analysis. Neural Networks, 2005, 18 (8), pp. 1051-1063. ⟨inria-00000222⟩