The main goal of this paper is to prove inequalities on the reconstruction error for kernel principal component analysis (KPCA). With respect to previous work on this topic, our contribution is twofold: (1) we give bounds that explicitly take into account the empirical centering step of the algorithm, and (2) we show that a "localized" approach yields more accurate bounds. In particular, we establish faster rates of convergence towards the minimum reconstruction error; more precisely, we prove that the convergence rate can typically be faster than n^{-1/2}. We also obtain a new relative bound on the error.

A secondary goal, for which we present similar contributions, is to obtain convergence bounds for the partial sums of the largest or smallest eigenvalues of the kernel Gram matrix towards the corresponding eigenvalues of the kernel operator. These quantities are naturally linked to the KPCA procedure; furthermore, these results can have applications to the study of various other kernel algorithms.

The results are presented in a functional-analytic framework, which is suited to dealing rigorously with reproducing kernel Hilbert spaces of infinite dimension.
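As a concrete illustration of the empirical quantities the abstract refers to, the following Python sketch computes the empirically centered Gram matrix, the empirical reconstruction error of a d-dimensional KPCA projection, and the partial sums of the largest Gram-matrix eigenvalues. This is only a minimal numerical sketch, not the paper's analysis: the Gaussian kernel, its bandwidth, the toy data, and all function names here are illustrative assumptions, not anything prescribed by the paper.

```python
# Minimal sketch (illustrative assumptions throughout): empirical KPCA
# quantities on a toy sample, computed with numpy.
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    # The Gaussian kernel and bandwidth are arbitrary example choices.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def centered_gram(K):
    # Empirical centering step in feature space: K_c = H K H,
    # with H = I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kpca_reconstruction_error(K, d):
    # Empirical reconstruction error of the best d-dimensional projection
    # in feature space: the sum of the n - d smallest eigenvalues of K_c / n.
    n = K.shape[0]
    eigvals = np.linalg.eigvalsh(centered_gram(K)) / n
    eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negatives
    return np.sum(np.sort(eigvals)[: n - d])

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))          # toy sample, n = 200
K = gaussian_kernel(X, sigma=2.0)

for d in (1, 2, 5, 10):
    print(f"d={d:2d}  empirical reconstruction error = "
          f"{kpca_reconstruction_error(K, d):.4f}")

# Partial sums of the d largest eigenvalues of K / n: the empirical
# counterparts of the partial sums of kernel-operator eigenvalues whose
# convergence the paper bounds.
lam = np.sort(np.linalg.eigvalsh(K / K.shape[0]))[::-1]
print("partial sums of largest eigenvalues:", np.cumsum(lam)[:5])
```

The paper's bounds concern how far these empirical quantities deviate from their population counterparts (the minimum reconstruction error and the eigenvalues of the kernel operator); the sketch above computes only the empirical side.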