This letter addresses the robustness of kernel principal component analysis. A class of new robust procedures is proposed based on the eigenvalue decomposition of a weighted covariance matrix. The proposed procedures place less weight on deviant patterns and are thus more resistant to data contamination and model deviation. Theoretical influence functions are derived, and numerical examples are presented. Both theoretical and numerical results indicate that the proposed robust method outperforms the conventional approach in being less sensitive to outliers. The robust method and results also apply to functional principal component analysis.
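The idea of downweighting deviant patterns in a kernel eigenvalue problem can be sketched as follows. This is a minimal illustrative implementation, not the letter's exact procedure: it assumes an RBF kernel, an exponential reweighting rule based on each point's feature-space distance to the weighted mean, and solves the weighted-covariance eigenproblem through the kernel trick by eigendecomposing D^{1/2} K_c D^{1/2} with D = diag(w).

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def robust_kpca(K, n_components=2, n_iter=10):
    """Toy robust kernel PCA via eigendecomposition of a weighted,
    centered kernel matrix. The exponential reweighting rule is an
    illustrative choice, not necessarily the letter's."""
    n = K.shape[0]
    w = np.full(n, 1.0 / n)              # start from uniform weights
    for _ in range(n_iter):
        # Weighted centering: subtract the weighted feature-space mean.
        m = K @ w                        # <phi(x_i), mu_w>
        mm = w @ K @ w                   # <mu_w, mu_w>
        Kc = K - m[:, None] - m[None, :] + mm
        # Squared feature-space distance of each point to the mean.
        d2 = np.clip(np.diag(Kc), 0.0, None)
        # Place less weight on deviant (far-from-mean) patterns.
        w = np.exp(-d2 / (np.median(d2) + 1e-12))
        w /= w.sum()
    # Kernel-trick eigenproblem for the weighted covariance operator.
    s = np.sqrt(w)
    M = (s[:, None] * Kc) * s[None, :]
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(vals)[::-1][:n_components]
    return vals[idx], vecs[:, idx], w

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X[:3] += 10.0                            # inject three outliers
vals, vecs, w = robust_kpca(rbf_kernel(X, gamma=0.1))
print(w[:3].sum() < 3 / 50)              # outliers get below-uniform weight
```

In this sketch, contaminated points end up with far less than the uniform weight 1/n, so they contribute little to the weighted covariance and hence to the leading eigenvectors; with uniform fixed weights the procedure reduces to conventional kernel PCA.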