Subset based least squares subspace regression in RKHS

  • Authors:
  • L. Hoegaerts; J. A. K. Suykens; J. Vandewalle; B. De Moor

  • Affiliations:
  • Katholieke Universiteit Leuven, Department of Electrical Engineering, ESAT-SCD-SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee), Belgium (all authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2005

Abstract

Kernel-based methods suffer from excessive time and memory requirements when applied to large datasets, since the involved optimization problems typically scale polynomially in the number of data samples. As a remedy, existing least squares methods either only reduce the number of parameters (for fast training) or only work on a reduced set (for fast evaluation). Departing from the Nyström-based feature approximation, via the fixed-size LS-SVM model, we propose a general regression framework based on restricting the search space to a subspace and on a particular choice of basis vectors in feature space. In this general model, both reduction aspects are unified and become explicit model choices. This allows kernel Partial Least Squares and kernel Canonical Correlation Analysis to be accommodated for regression with a sparse representation, which makes them applicable to large datasets with little loss in accuracy.
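The Nyström-based feature approximation underlying the framework can be sketched as follows: a small subset of m points defines an explicit finite-dimensional feature map via the eigendecomposition of the m-by-m kernel block, and regularized least squares is then solved in that subspace. The toy data, kernel width, subset selection (random), and ridge parameter below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical, for illustration only).
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

def rbf(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

# Choose a subset of m basis vectors (randomly here; the subset
# choice itself is a model decision).
m = 50
idx = rng.choice(len(X), size=m, replace=False)
Xm = X[idx]

# Nystrom feature map: phi(x) = Lambda^{-1/2} U^T k_m(x), built from
# the eigendecomposition of the m x m kernel block K_mm.
Kmm = rbf(Xm, Xm)
lam, U = np.linalg.eigh(Kmm)
keep = lam > 1e-8 * lam.max()        # drop near-null directions
lam, U = lam[keep], U[:, keep]
Phi = rbf(X, Xm) @ U / np.sqrt(lam)  # n x m_eff approximate features

# Regularized least squares restricted to the subspace.
gamma = 1e-3
w = np.linalg.solve(Phi.T @ Phi + gamma * np.eye(Phi.shape[1]), Phi.T @ y)

# Evaluation touches only the m subset points: a sparse representation.
X_test = np.linspace(-3, 3, 200)[:, None]
y_pred = rbf(X_test, Xm) @ U / np.sqrt(lam) @ w
```

Both reduction aspects appear explicitly here: the parameter vector w lives in the m-dimensional subspace (fast training), and prediction needs kernel evaluations against the m subset points only (fast evaluation).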