In this paper, we present a novel incremental algorithm for principal component analysis (PCA). The proposed algorithm is covariance-free: it requires less computation and storage to find the eigenvectors than incremental PCA methods that maintain a covariance matrix. The main contributions of this paper are to deal explicitly with the changing sample mean and to use Gram-Schmidt orthogonalization (GSO) to enforce the orthogonality of the eigenvector estimates. As a result, the proposed algorithm finds more accurate eigenvectors than comparable algorithms. We evaluate the algorithm experimentally on data sets with various properties and show that it finds eigenvectors closer to those of the batch algorithm than the other methods do.