An adaptive kernel principal component analysis (AKPCA) method is presented that has the flexibility to accurately track the kernel principal components (KPCs). The contribution of this paper is twofold. First, the KPCs are formulated recursively to overcome the batch nature of standard kernel principal component analysis (KPCA). This formulation is derived from a recursive eigendecomposition of the kernel covariance matrix and captures the variation in the KPCs caused by new data. Second, the kernel covariance matrix is updated to adapt to the changing characteristics of the data. With this adaptive method, the KPCs are adjusted without re-eigendecomposing the kernel Gram matrix. The proposed method not only maintains a constant update speed and memory footprint as the data size increases, but also alleviates the sub-optimality of batch KPCA on non-stationary data. Experiments on simulated data and real applications are detailed to assess the utility of the proposed method. The results demonstrate that the approach yields improvements in both computational speed and approximation accuracy.
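The recursive mechanism the abstract describes can be illustrated in input space. The sketch below (all names are hypothetical; this is an input-space analogue, not the paper's feature-space derivation) updates the eigendecomposition of an exponentially weighted covariance matrix after each new sample by solving only a small (r+1)×(r+1) eigenproblem, so the full matrix is never re-eigendecomposed:

```python
import numpy as np

def update_eig(U, lam, x, alpha=0.95):
    """Rank-one update of the eigendecomposition U @ diag(lam) @ U.T
    after the exponentially weighted covariance update
        C_new = alpha * C + (1 - alpha) * x @ x.T.
    Only a small (r+1)x(r+1) eigenproblem is solved, avoiding a full
    re-eigendecomposition as the data stream grows."""
    r = U.shape[1]
    p = U.T @ x                      # coefficients of x in the current basis
    h = x - U @ p                    # residual orthogonal to the basis
    hn = np.linalg.norm(h)
    if hn > 1e-12:
        # Extend the orthonormal basis by the normalized residual direction.
        Q = np.hstack([U, (h / hn)[:, None]])
        v = np.append(p, hn)
        M = alpha * np.diag(np.append(lam, 0.0)) + (1 - alpha) * np.outer(v, v)
    else:
        # x already lies in the span of U; no basis extension needed.
        Q = U
        M = alpha * np.diag(lam) + (1 - alpha) * np.outer(p, p)
    d, V = np.linalg.eigh(M)         # small eigenproblem
    order = np.argsort(d)[::-1][:r]  # keep the r leading eigenpairs
    return Q @ V[:, order], d[order]
```

When the retained rank r equals the data dimension, this update reproduces the exact exponentially weighted covariance; with r smaller, truncating the trailing eigenpair after each step gives the constant-cost approximate tracking behavior the abstract claims for the kernel-space recursion.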