In this paper we investigate the regularization property of Kernel Principal Component Analysis (KPCA) by studying its use as a preprocessing step for supervised learning problems. We show that performing KPCA and then ordinary least squares on the projected data, a procedure known as kernel principal component regression (KPCR), is equivalent to spectral cut-off regularization, with the regularization parameter being exactly the number of principal components retained. Using probabilistic estimates for integral operators, we derive error estimates for KPCR and propose a parameter choice procedure that allows us to prove consistency of the algorithm.
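As a purely illustrative sketch (not the construction or the parameter choice analyzed in the paper), the NumPy snippet below implements KPCR with a Gaussian kernel: it centers the kernel matrix, keeps the k leading kernel principal components (the spectral cut-off), fits ordinary least squares on the projected training data, and predicts on new points via the same projection. The function names, the choice of kernel, and the values of k and gamma in the toy usage are assumptions made for the example only.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between the rows of A and the rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def kpcr_fit_predict(X, y, X_test, k, gamma=1.0):
    # Kernel principal component regression: KPCA projection onto the k
    # leading components (spectral cut-off), then ordinary least squares.
    # Assumes k is smaller than the numerical rank of the centered kernel.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)

    # Center the kernel matrix in feature space.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H

    # Eigendecomposition; keep the k largest eigenvalues/eigenvectors.
    evals, evecs = np.linalg.eigh(Kc)
    idx = np.argsort(evals)[::-1][:k]
    lam, V = evals[idx], evecs[:, idx]

    # Projections of the training data onto the k kernel principal components.
    Z = V * np.sqrt(lam)

    # Ordinary least squares on the projected data (with an intercept).
    Zb = np.column_stack([np.ones(n), Z])
    w, *_ = np.linalg.lstsq(Zb, y, rcond=None)

    # Center the test kernel consistently and project onto the same components.
    K_test = rbf_kernel(X_test, X, gamma)
    K_test_c = (K_test - K_test.mean(1, keepdims=True)
                - K.mean(0, keepdims=True) + K.mean())
    Z_test = K_test_c @ V / np.sqrt(lam)
    return np.column_stack([np.ones(len(X_test)), Z_test]) @ w

# Toy usage with hypothetical parameter values: noisy sine regression.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
X_test = np.linspace(-3, 3, 50)[:, None]
y_hat = kpcr_fit_predict(X, y, X_test, k=10, gamma=0.5)
```

In this sketch the number of retained components k plays the role of the regularization parameter described above: small k corresponds to strong regularization, while k close to the rank of the kernel matrix recovers (numerically ill-conditioned) kernel least squares.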