Efficiently updating and tracking the dominant kernel principal components

  • Authors:
  • L. Hoegaerts, L. De Lathauwer, I. Goethals, J. A. K. Suykens, J. Vandewalle, B. De Moor

  • Affiliations:
  • Katholieke Universiteit Leuven, Department of Electrical Engineering, ESAT-SCD-SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee), Belgium (L. Hoegaerts, I. Goethals, J. A. K. Suykens, J. Vandewalle, B. De Moor)
  • ETIS, UMR 8051 (CNRS, ENSEA, UCP), Avenue du Ponceau 6, BP 44, F-95014 Cergy-Pontoise Cedex, France (L. De Lathauwer)

  • Venue:
  • Neural Networks
  • Year:
  • 2007

Abstract

The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods in machine learning, e.g. kernel principal component analysis, feature approximation, denoising, compression, and prediction. For dynamic and/or large-scale data, however, the batch nature and computational cost of the eigendecomposition limit these methods in numerous applications. In this paper we present an efficient incremental approach for fast calculation of the dominant kernel eigenbasis, which allows us to track the kernel eigenspace dynamically. Experiments show that our updating scheme delivers a numerically stable and accurate approximation of the eigenvalues and eigenvectors at every iteration, as compared with the batch algorithm.
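The abstract does not reproduce the updating scheme itself, so the sketch below is only a generic illustration of incremental kernel-eigenbasis tracking, not the authors' algorithm. It computes the dominant eigenbasis of an (uncentered) RBF Gram matrix in batch, then appends one sample by solving a small bordered (k+2)-by-(k+2) eigenproblem instead of redecomposing the full matrix; the kernel choice and all names (`rbf_kernel`, `update_eigenbasis`) are illustrative assumptions.

```python
# Minimal sketch, assuming an RBF kernel and a generic bordered eigenupdate;
# this is NOT the paper's algorithm (the abstract does not specify it).
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-sample sets X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def dominant_eigenbasis(K, k):
    """Top-k eigenpairs of a symmetric Gram matrix (batch reference)."""
    w, V = np.linalg.eigh(K)                 # ascending order
    return w[::-1][:k], V[:, ::-1][:, :k]

def update_eigenbasis(w, V, k_col, k_nn):
    """Grow an n x n Gram matrix, known only through its top-k eigenpairs
    (w, V), by one sample: the new column k_col (length n) and diagonal
    entry k_nn. Solves a (k+2) x (k+2) eigenproblem in the subspace spanned
    by V, the residual of k_col, and the new coordinate axis."""
    n, k = V.shape
    c = V.T @ k_col                          # coefficients in current basis
    r = k_col - V @ c                        # residual orthogonal to V
    rho = np.linalg.norm(r)
    q = r / rho if rho > 1e-12 else np.zeros(n)
    S = np.zeros((k + 2, k + 2))             # bordered matrix, compressed
    S[:k, :k] = np.diag(w)
    S[:k, k + 1] = S[k + 1, :k] = c
    S[k, k + 1] = S[k + 1, k] = rho
    S[k + 1, k + 1] = k_nn
    ws, Vs = np.linalg.eigh(S)
    ws, Vs = ws[::-1][:k], Vs[:, ::-1][:, :k]
    B = np.zeros((n + 1, k + 2))             # embed back into n+1 samples
    B[:n, :k], B[:n, k], B[n, k + 1] = V, q, 1.0
    return ws, B @ Vs                        # updated top-k eigenpairs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w, V = dominant_eigenbasis(rbf_kernel(X, X), k=5)
x = rng.normal(size=(1, 3))                  # one newly arriving sample
w_up, V_up = update_eigenbasis(w, V, rbf_kernel(X, x).ravel(),
                               rbf_kernel(x, x)[0, 0])
Xa = np.vstack([X, x])
w_ref, _ = dominant_eigenbasis(rbf_kernel(Xa, Xa), k=5)
print(np.max(np.abs(w_up - w_ref)))          # small when top-5 dominates K
```

In this style of update, each new sample costs roughly O(nk^2) for projecting and rotating the basis plus an O(k^3) small eigenproblem, versus O(n^3) for a fresh batch eigendecomposition. The price is that the discarded residual spectrum makes repeated updates drift, which is why comparing against the batch result at every iteration, as the paper's experiments do, is the natural accuracy check.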