KPCA can extract nonlinear features from a data set; however, its computational cost grows with the size of the training sample set. In this paper, we propose an adaptive kernel subspace method that extracts features efficiently. The method is methodologically consistent with KPCA and improves efficiency by adaptively selecting the spanning vectors of the kernel principal components, while affecting accuracy only slightly. Experiments on two-dimensional data, the MNIST data set, and the USPS data set show that the proposed feature extraction is more efficient than KPCA and the reference methods.
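As background for the method above, a minimal sketch of standard KPCA (RBF kernel, centered kernel matrix, eigendecomposition) is given below. This is the baseline whose cost scales with the training set size — the kernel matrix is n×n — not the adaptive spanning-vector selection proposed in the paper; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def kpca(X, n_components=2, gamma=1.0):
    """Minimal standard KPCA with an RBF kernel (baseline sketch,
    not the paper's adaptive spanning-vector method)."""
    # Pairwise squared Euclidean distances between training points
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    K = np.exp(-gamma * d2)  # n x n RBF (Gaussian) kernel matrix

    # Center the kernel matrix, i.e. center the data in feature space
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one

    # Eigendecomposition; largest eigenvalues give the leading components
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]

    # Scale eigenvectors so each component has unit norm in feature space
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))

    # Project the training points onto the kernel principal components
    return Kc @ alphas

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
Z = kpca(X, n_components=2, gamma=0.5)  # Z has shape (40, 2)
```

Note that every training point contributes a kernel column, which is exactly the O(n) per-projection cost that motivates restricting the components to a smaller set of spanning vectors.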