Candid Covariance-Free Incremental Principal Component Analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Support Vector Machines for Pattern Classification (Advances in Pattern Recognition).
Incremental learning of feature space and classifier for face recognition. Neural Networks (2005 Special Issue: IJCNN 2005).
Active visual learning and recognition using incremental kernel PCA. AI'05: Proceedings of the 18th Australian Joint Conference on Advances in Artificial Intelligence.
Incremental linear discriminant analysis for classification of data streams. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
A novel incremental principal component analysis and its application for face recognition. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Incremental Kernel Principal Component Analysis. IEEE Transactions on Image Processing.
Incremental Learning of Chunk Data for Online Pattern Classification Systems. IEEE Transactions on Neural Networks.
Adaptive incremental principal component analysis in nonstationary online learning environments. IJCNN'09: Proceedings of the 2009 International Joint Conference on Neural Networks.
In this paper, we present a modified version of Incremental Kernel Principal Component Analysis (IKPCA), originally proposed by Takeuchi et al. as an online nonlinear feature extraction method. The proposed IKPCA learns a high-dimensional feature space incrementally by solving an eigenvalue problem whose matrix size is determined by the number of linearly independent data. In the proposed IKPCA, the independent data used for calculating eigenvectors in the feature space are selected in a low-dimensional eigen-feature space. Hence, the eigenvalue problem is usually small, which allows IKPCA to learn eigen-feature spaces very quickly even though the eigenvalue decomposition must be carried out at every learning stage. The proposed IKPCA consists of two learning phases: an initial learning phase and an incremental learning phase. In the former, some parameters are optimized and an initial eigen-feature space is computed by applying conventional KPCA. In the latter, the eigen-feature space is updated incrementally whenever a new datum is given. In the experiments, we evaluate the learning time and the approximation accuracy of the eigenvectors and eigenvalues. The results demonstrate that the proposed IKPCA learns eigen-feature spaces very quickly with good approximation accuracy.
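The two-phase scheme described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' exact algorithm): an initial batch KPCA is run on the first chunk, and each incoming sample is tested for approximate linear independence in feature space via its projection residual; only when the residual exceeds a threshold is the sample added to the independent set and the small eigenproblem re-solved. The RBF kernel, the class name, the tolerance `tol`, and the omission of kernel centering are all simplifying assumptions made here for brevity.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix between row-sample sets X and Y (assumed kernel)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class IncrementalKPCASketch:
    """Hypothetical sketch: keep a small set of (approximately) linearly
    independent samples in feature space and re-solve the eigenproblem on
    that set whenever a new sample extends it. Centering is omitted."""

    def __init__(self, gamma=1.0, tol=1e-3):
        self.gamma, self.tol = gamma, tol
        self.B = None        # selected independent samples
        self.alphas = None   # eigenvectors (kernel-space coefficients)
        self.lams = None     # eigenvalues, sorted in descending order

    def _solve(self):
        # eigenproblem size equals len(B), i.e. the number of independent data
        K = rbf_kernel(self.B, self.B, self.gamma)
        lams, vecs = np.linalg.eigh(K)
        order = np.argsort(lams)[::-1]
        self.lams, self.alphas = lams[order], vecs[:, order]

    def fit_initial(self, X):
        # initial learning phase: conventional batch KPCA on the first chunk
        self.B = X.copy()
        self._solve()

    def partial_fit(self, x):
        # incremental phase: measure how well phi(x) is represented by
        # span{phi(b) : b in B} via the kernel projection residual
        k = rbf_kernel(self.B, x[None, :], self.gamma)[:, 0]
        K = rbf_kernel(self.B, self.B, self.gamma)
        a = np.linalg.solve(K + 1e-10 * np.eye(len(self.B)), k)
        resid = rbf_kernel(x[None, :], x[None, :], self.gamma)[0, 0] - k @ a
        if resid > self.tol:
            # x is (approximately) independent: enlarge B, re-solve eigenproblem
            self.B = np.vstack([self.B, x])
            self._solve()
```

Because dependent samples never enlarge `B`, the eigenproblem stays small even as the stream grows, which mirrors the speed argument made in the abstract.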