Statistical properties of kernel principal component analysis
Machine Learning
Accurate Error Bounds for the Eigenvalues of the Kernel Matrix
The Journal of Machine Learning Research
Dimensionality reduction and generalization
Proceedings of the 24th international conference on Machine learning
Unsupervised slow subspace-learning from stationary processes
Theoretical Computer Science
Generalization Bounds for K-Dimensional Coding Schemes in Hilbert Spaces
ALT '08 Proceedings of the 19th international conference on Algorithmic Learning Theory
On Relevant Dimensions in Kernel Feature Spaces
The Journal of Machine Learning Research
Accuracy of suboptimal solutions to kernel principal component analysis
Computational Optimization and Applications
Transfer bounds for linear feature learning
Machine Learning
Oracle inequalities for support vector machines that are based on random entropy numbers
Journal of Complexity
Accurate Probabilistic Error Bound for Eigenvalues of Kernel Matrix
ACML '09 Proceedings of the 1st Asian Conference on Machine Learning: Advances in Machine Learning
A New Canonical Correlation Analysis Algorithm with Local Discrimination
Neural Processing Letters
Automatic model selection for the optimization of SVM kernels
Pattern Recognition
On Learning with Integral Operators
The Journal of Machine Learning Research
Compressed fisher linear discriminant analysis: classification of randomly projected data
Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining
Computers & Mathematics with Applications
On spectral windows in supervised learning from data
Information Processing Letters
K-dimensional coding schemes in Hilbert spaces
IEEE Transactions on Information Theory
The Sample Complexity of Dictionary Learning
The Journal of Machine Learning Research
Model selection in kernel methods based on a spectral analysis of label information
DAGM'06 Proceedings of the 28th conference on Pattern Recognition
Unsupervised slow subspace-learning from stationary processes
ALT'06 Proceedings of the 17th international conference on Algorithmic Learning Theory
Soft analyzer modeling for dearomatization unit using KPCR with online eigenspace decomposition
ICONIP'06 Proceedings of the 13th international conference on Neural Information Processing - Volume Part I
Generalization bounds for subspace selection and hyperbolic PCA
SLSFS'05 Proceedings of the 2005 international conference on Subspace, Latent Structure and Feature Selection
In this paper, the relationships between the eigenvalues of the m×m Gram matrix K for a kernel κ(·,·), computed on a sample x1,...,xm drawn from a density p(x), and the eigenvalues of the corresponding continuous eigenproblem are analyzed. The differences between the two spectra are bounded, and a performance bound on kernel principal component analysis (PCA) is provided, showing that good performance can be expected even in very high-dimensional feature spaces provided that the sample eigenvalues decay sufficiently quickly.
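The connection the abstract describes can be illustrated numerically: the eigenvalues of the sample Gram matrix, divided by the sample size m, serve as estimates of the eigenvalues of the continuous (integral-operator) eigenproblem. The sketch below, a minimal illustration and not the paper's own experiment, computes this for an RBF kernel (the kernel choice, bandwidth, and Gaussian sampling density are assumptions for the example).

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
m = 200
X = rng.normal(size=(m, 1))          # sample x_1,...,x_m from p(x) = N(0, 1)

K = rbf_kernel(X)                     # m x m Gram matrix
eigvals = np.linalg.eigvalsh(K)[::-1] # eigenvalues of K, descending

# Scaling by 1/m gives empirical estimates of the eigenvalues of the
# continuous eigenproblem for the integral operator with kernel κ and
# density p; the paper bounds the difference between the two spectra.
process_eigs = eigvals / m
```

Because κ(x, x) = 1 for the RBF kernel, the scaled eigenvalues sum to trace(K)/m = 1, so `process_eigs` can be read as the fraction of variance captured along each kernel principal direction; a fast decay of these values is exactly the condition under which the paper's bound predicts good kernel PCA performance.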