An improved kernel principal component analysis for large-scale data set
ISNN'10 Proceedings of the 7th international conference on Advances in Neural Networks - Volume Part II
Kernel Principal Component Analysis (KPCA) is a nonlinear feature extraction method that generally requires eigendecomposition of the kernel matrix. Because the size of the kernel matrix scales with the number of data points, it is infeasible to store and eigendecompose for large-scale data sets. To overcome these computational and storage problems, a new framework, Matrix-based Kernel Principal Component Analysis (M-KPCA), is proposed. By dividing the large-scale data set into small subsets, the autocorrelation matrix of each subset can be treated as a single computational unit. A novel polynomial-matrix kernel function is adopted to compute the similarity between data matrices instead of between vectors, and it is proved that the ordinary polynomial kernel is an extreme case of the polynomial-matrix kernel. The proposed M-KPCA greatly reduces the size of the kernel matrix, making its computation feasible. Its effectiveness is demonstrated by experimental results on artificial and real data sets.
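The idea in the abstract can be sketched in a few lines of NumPy. This is not the authors' implementation: the exact form of the polynomial-matrix kernel is not given here, so the version below (a polynomial of the Frobenius inner product between subset autocorrelation matrices) and all parameter names (`degree`, `c`, `n_subsets`) are illustrative assumptions. What it does show is the key point: the kernel matrix is m x m over subsets, not n x n over points.

```python
import numpy as np

def matrix_kernel(Xi, Xj, degree=2, c=1.0):
    # Hypothetical polynomial-matrix kernel: a polynomial of the
    # Frobenius inner product of the subsets' autocorrelation matrices.
    # (The paper's exact kernel may differ; this is a sketch.)
    Ri = Xi.T @ Xi                      # d x d autocorrelation matrix of subset i
    Rj = Xj.T @ Xj
    return (np.sum(Ri * Rj) + c) ** degree

def m_kpca(X, n_subsets=10, n_components=2, degree=2, c=1.0):
    # Divide the large data set into small subsets; each subset's
    # autocorrelation matrix acts as one computational unit.
    subsets = np.array_split(X, n_subsets)
    m = len(subsets)
    K = np.empty((m, m))                # m x m instead of n x n
    for i in range(m):
        for j in range(i, m):
            K[i, j] = K[j, i] = matrix_kernel(subsets[i], subsets[j], degree, c)
    # Centre the (much smaller) kernel matrix and eigendecompose it,
    # as in standard KPCA.
    ones = np.full((m, m), 1.0 / m)
    Kc = K - ones @ K - K @ ones + ones @ K @ ones
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    return vals[order], vecs[:, order]

# 1000 points would give a 1000 x 1000 kernel matrix in plain KPCA;
# here only a 20 x 20 matrix is built.
X = np.random.default_rng(0).normal(size=(1000, 5))
vals, vecs = m_kpca(X, n_subsets=20, n_components=2)
print(vals.shape, vecs.shape)
```

With subsets of size one, each autocorrelation matrix reduces to an outer product x xᵀ, which is the sense in which a plain polynomial kernel can appear as the extreme case of the matrix version.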