Small-sample, high-dimensional principal component analysis (PCA) suffers from variance inflation and a resulting lack of generalizability. It has previously been pointed out that a simple leave-one-out variance renormalization scheme can cure the problem. In this paper we generalize the cure in two directions: first, we propose a computationally less intensive approximate leave-one-out estimator; second, we show that variance inflation is also present in kernel principal component analysis (kPCA), and we provide a non-parametric renormalization scheme that can efficiently restore generalizability in kPCA. As in the PCA case, our analysis also suggests a simplified approximate expression.
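To make the leave-one-out renormalization idea concrete, here is a minimal NumPy sketch for ordinary PCA. It is an illustration under our own assumptions, not the paper's implementation: the function name `loo_variance_renormalization`, the sign-alignment heuristic, and the component-wise rescaling of scores are our choices, and the paper's approximate estimator is designed precisely to avoid the explicit n-fold refitting done here.

```python
import numpy as np

def loo_variance_renormalization(X, n_components):
    """Illustrative leave-one-out variance renormalization for PCA scores.

    Fit PCA on all samples, then refit with each sample held out and
    project the held-out sample onto the reduced basis.  The ratio of
    out-of-sample to in-sample score variance gives per-component
    scale factors that deflate the inflated training variance.
    """
    n, _ = X.shape
    # In-sample PCA on the centered data set.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    train_scores = Xc @ Vt[:n_components].T
    train_var = train_scores.var(axis=0)

    # Leave-one-out: project each held-out sample onto a basis
    # estimated from the remaining n - 1 samples.
    loo_scores = np.empty((n, n_components))
    for i in range(n):
        mask = np.arange(n) != i
        mu = X[mask].mean(axis=0)
        _, _, Vti = np.linalg.svd(X[mask] - mu, full_matrices=False)
        # Align component signs with the full-data basis (components are
        # assumed stable enough across LOO fits that no reordering is needed).
        dots = np.sum(Vti[:n_components] * Vt[:n_components], axis=1)
        signs = np.where(dots >= 0.0, 1.0, -1.0)
        loo_scores[i] = (X[i] - mu) @ Vti[:n_components].T * signs
    loo_var = loo_scores.var(axis=0)

    # Per-component scale factors mapping the inflated training variance
    # to the leave-one-out estimate of generalizable variance.
    scale = np.sqrt(loo_var / train_var)
    return train_scores * scale, scale
```

A toy call such as `loo_variance_renormalization(np.random.randn(20, 500), 3)` exercises the small-sample regime (n much smaller than the dimension) where the inflation is most pronounced; for kPCA, the paper's non-parametric scheme plays the analogous role in feature space.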