Dimensionality reduction and generalization

  • Authors:
  • Sofia Mosci; Lorenzo Rosasco; Alessandro Verri

  • Affiliations:
  • Università di Genova, Genova, Italy (all authors)

  • Venue:
  • Proceedings of the 24th International Conference on Machine Learning (ICML)
  • Year:
  • 2007


Abstract

In this paper we investigate the regularization properties of Kernel Principal Component Analysis (KPCA) by studying its use as a preprocessing step in supervised learning problems. We show that performing KPCA followed by ordinary least squares on the projected data, a procedure known as kernel principal component regression (KPCR), is equivalent to spectral cut-off regularization, where the regularization parameter is exactly the number of principal components retained. Using probabilistic estimates for integral operators, we prove error estimates for KPCR and propose a parameter choice procedure that allows us to prove consistency of the algorithm.
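
To make the procedure concrete, here is a minimal sketch of KPCR viewed as spectral cut-off, not the authors' implementation: it assumes a Gaussian kernel, omits the centering step of full KPCA for brevity, and the helper names and kernel width `sigma` are illustrative. Keeping the top m eigenpairs of the kernel matrix and inverting only on that subspace is algebraically the same as running ordinary least squares on the m-dimensional KPCA projections.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of A and B.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def kpcr_fit_predict(X_train, y_train, X_test, m, sigma=1.0):
    """KPCR with m components via spectral cut-off (illustrative sketch).

    m is the regularization parameter: the kernel matrix is inverted
    only on the span of its top-m eigenvectors, all other spectral
    components are cut off. m should not exceed the numerical rank of K.
    """
    K = gaussian_kernel(X_train, X_train, sigma)
    lam, U = np.linalg.eigh(K)                  # eigenvalues ascending
    lam = lam[::-1][:m]                         # top-m eigenvalues
    U = U[:, ::-1][:, :m]                       # matching eigenvectors
    # Spectral cut-off "inverse": alpha = U_m diag(1/lam_m) U_m^T y,
    # equivalent to OLS on the m-dimensional KPCA projections.
    alpha = U @ ((U.T @ y_train) / lam)
    K_test = gaussian_kernel(X_test, X_train, sigma)
    return K_test @ alpha

# Toy usage: fewer components means stronger regularization.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
X_new = np.linspace(-3, 3, 50)[:, None]
y_hat = kpcr_fit_predict(X, y, X_new, m=10, sigma=1.0)
```

In this view the parameter choice problem studied in the paper is the choice of m: small m discards most of the spectrum (heavy regularization), while m equal to the sample size recovers unregularized kernel least squares on the full kernel matrix.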