A non-zero-approaching adaptive learning rate is proposed to guarantee the global convergence of Oja's principal component analysis (PCA) learning algorithm. Most existing adaptive learning rates for Oja's algorithm are required to approach zero as the learning step increases, which is impractical in many applications because of computational round-off limitations and tracking requirements. The proposed adaptive learning rate overcomes this shortcoming: it converges to a positive constant, preserving the evolution rate as the learning step increases, whereas rates that approach zero slow convergence considerably and increasingly with time. Rigorous mathematical proofs of the global convergence of Oja's algorithm under the proposed learning rate are given by studying the convergence of an equivalent deterministic discrete-time (DDT) system. Extensive simulations illustrate and verify the theory; the results show that the proposed adaptive learning rate makes Oja's PCA algorithm better suited to online learning.
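For orientation, here is a minimal sketch of Oja's single-unit PCA rule driven by a learning rate that converges to a positive constant instead of decaying to zero. The schedule `eta = eta_inf + (eta_0 - eta_inf) / (1 + k)`, the constants `eta_0` and `eta_inf`, and the synthetic Gaussian data are illustrative assumptions, not the particular rate or experiments from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose first principal component we want to track.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])                       # true covariance (assumed)
X = rng.multivariate_normal(np.zeros(2), C, size=5000)

w = rng.standard_normal(2)
w /= np.linalg.norm(w)

eta_0 = 0.1     # initial learning rate (hypothetical value)
eta_inf = 0.02  # positive limit of the learning rate (hypothetical value)

for k, x in enumerate(X):
    # Illustrative non-zero-approaching schedule: eta_k -> eta_inf > 0,
    # so the update step never vanishes as k grows.
    eta = eta_inf + (eta_0 - eta_inf) / (1.0 + k)
    y = w @ x
    # Oja's single-unit PCA rule: w <- w + eta * (y*x - y^2 * w).
    w = w + eta * (y * x - (y ** 2) * w)

# Compare against the dominant eigenvector of the true covariance.
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, -1]                              # eigh sorts ascending
print("cosine similarity:", abs(w @ v1) / np.linalg.norm(w))
```

Because the rate stays bounded away from zero, the weight vector keeps adapting at a roughly constant speed late in the run, which is the property the abstract contrasts with zero-approaching schedules; the residual fluctuation around the principal direction is then of the order of the limiting rate.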