Algorithms and Implementation Architectures for Hebbian Neural Networks
IWANN '01 Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks: Connectionist Models of Neurons, Learning Processes and Artificial Intelligence-Part I
Principal component analysis (PCA) is one of the most widely used feature extraction methods, and a variety of learning algorithms for PCA have been proposed. Many conventional algorithms, however, either diverge or converge very slowly if the learning rate parameters are not properly chosen. In this paper, an adaptive learning algorithm (ALA) for PCA is proposed. By adaptively selecting the learning rate parameters, we show that the m weight vectors in the ALA converge to the first m principal component vectors at almost the same rate. Compared with Sanger's generalized Hebbian algorithm (GHA), the ALA can quickly find the desired principal component vectors in cases where the GHA fails to do so. Finally, simulation results are included to illustrate the effectiveness of the ALA.
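To make the baseline concrete, the following is a minimal sketch of Sanger's generalized Hebbian algorithm (GHA), the method the ALA is compared against. The function name `gha_pca` and the fixed global learning rate `lr` are illustrative choices, not the paper's notation; the ALA itself replaces this fixed rate with per-component, adaptively selected rates, whose exact rule is not given in the abstract.

```python
import numpy as np

def gha_pca(X, m, lr=0.01, epochs=50, seed=0):
    """Sanger's generalized Hebbian algorithm (GHA).

    Estimates the first m principal component directions of the
    zero-mean data matrix X (samples in rows). Returns an (m, n)
    array whose rows approximate the leading eigenvectors of the
    sample covariance. A single fixed learning rate `lr` is used,
    which is exactly the setting where convergence can be slow or
    unstable if `lr` is poorly chosen.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = 0.1 * rng.standard_normal((m, n))  # rows: weight vectors
    for _ in range(epochs):
        for x in X:
            y = W @ x  # component activations y_i = w_i . x
            # Sanger's rule: dW = lr * (y x^T - LT[y y^T] W),
            # where LT[.] keeps the lower-triangular part so that
            # component i is deflated only by components 1..i.
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

Usage: center the data first (`X -= X.mean(axis=0)`), then call `gha_pca(X, m)`; the rows of the result converge, up to sign, to the top m eigenvectors of the covariance matrix when `lr` is small enough relative to the leading eigenvalue.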