Original Contribution: Convergence analysis of local feature extraction algorithms

  • Authors:
  • K. Hornik; C.-M. Kuan

  • Affiliations:
  • Institut für Statistik und Wahrscheinlichkeitstheorie, Technische Universität Wien, Austria; University of Illinois at Urbana-Champaign, USA

  • Venue:
  • Neural Networks
  • Year:
  • 1992


Abstract

We investigate the asymptotic behavior of a general class of on-line Principal Component Analysis (PCA) learning algorithms, focusing on two recently proposed algorithms that are based on strictly local learning rules. We rigorously establish that the behavior of these algorithms is intimately related to an ordinary differential equation (ODE) obtained by suitably averaging over the training patterns, and we study the equilibria of these ODEs and their local stability properties. In particular, our results imply that local PCA algorithms should always incorporate hierarchical rather than the more competitive, symmetric decorrelation, because the hierarchical variants yield superior performance.
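
The contrast between hierarchical and symmetric decorrelation can be made concrete. As a rough, illustrative sketch (not the authors' exact formulation or notation), the Python snippet below compares a hierarchical, Sanger/GHA-style update, in which each output unit is decorrelated only from the units preceding it, with a symmetric, Oja-subspace-style update, in which all units are treated alike; the dimensions, covariance, and decreasing learning-rate schedule are arbitrary choices for the demonstration.

```python
import numpy as np

def hierarchical_update(W, x, eta):
    """Hierarchical (Sanger/GHA-style) rule: lower-triangular masking
    decorrelates each unit only from the units preceding it."""
    y = W @ x
    return W + eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

def symmetric_update(W, x, eta):
    """Symmetric (Oja subspace-style) rule: all units are decorrelated
    from one another on an equal footing."""
    y = W @ x
    return W + eta * (np.outer(y, x) - np.outer(y, y) @ W)

# Toy run: stream samples from a fixed Gaussian and apply both rules.
rng = np.random.default_rng(0)
C = np.diag([5.0, 2.0, 1.0, 0.5])           # population covariance (assumed for the demo)
W_h = rng.normal(scale=0.1, size=(2, 4))    # hierarchical learner
W_s = W_h.copy()                            # symmetric learner
for t in range(20000):
    x = rng.multivariate_normal(np.zeros(4), C)
    eta = 1.0 / (100 + t)                   # decreasing gain, in the spirit of the ODE method
    W_h = hierarchical_update(W_h, x, eta)
    W_s = symmetric_update(W_s, x, eta)

print("hierarchical rows approach the leading eigenvectors (up to sign):\n", np.round(W_h, 2))
print("symmetric rows only span the leading eigenspace:\n", np.round(W_s, 2))
```

In this toy setting the hierarchical rule drives the rows toward the individual leading eigenvectors of the covariance, whereas the symmetric rule only recovers the subspace they span, which is the kind of distinction the paper's equilibrium and stability analysis addresses.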