Recursive principal components analysis

  • Authors:
  • Thomas Voegtlin

  • Affiliations:
  • INRIA, Campus Scientifique, B.P. 239, F-54506 Vandoeuvre-lès-Nancy Cedex, France

  • Venue:
  • Neural Networks - Special issue on neural networks and kernel methods for structured domains
  • Year:
  • 2005


Abstract

A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straightforward neural implementation of a logical stack.
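
To make the idea concrete, the following is a minimal NumPy sketch, not the paper's exact implementation: the state is a linear projection of the concatenated current input and previous state, trained with Oja's subspace rule. The dimensions, the learning rate, the toy i.i.d. input, and the unweighted concatenation of input and context are illustrative assumptions (the paper may weight the context term differently).

    import numpy as np

    rng = np.random.default_rng(0)

    d, n = 4, 8    # input and state dimensions (illustrative)
    eta = 0.01     # learning rate (illustrative)
    W = rng.normal(scale=0.1, size=(n, d + n))

    def rpca_step(W, s, y_prev, eta):
        # Concatenate the current input with the previous state (temporal context).
        x = np.concatenate([s, y_prev])
        # Linear recurrent encoding of the sequence so far.
        y = W @ x
        # Oja's subspace rule: dW = eta * (y x^T - y y^T W),
        # which drives W toward an orthonormal basis of the principal subspace.
        W = W + eta * (np.outer(y, x) - np.outer(y, y @ W))
        return W, y

    y = np.zeros(n)
    for t in range(5000):
        s = rng.normal(size=d)   # toy input; a real run would use structured sequences
        W, y = rpca_step(W, s, y, eta)

    # After convergence W is approximately orthonormal, so its transpose
    # approximately inverts the encoding: decoding recovers the last input
    # together with the preceding state.
    x_hat = W.T @ y
    s_hat, y_prev_hat = x_hat[:d], x_hat[d:]

Applying the decoding step repeatedly to the recovered context (y_prev_hat in this sketch) pops stored inputs one by one in reverse order of presentation, which is the stack-like retrieval described in the abstract.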