In most existing principal component analysis (PCA) learning algorithms, the learning rate is required to approach zero as the learning step increases. In many practical applications, however, a constant learning rate must be used because of computational round-off limitations and tracking requirements. This paper proposes a PCA learning algorithm with a constant learning rate and proves, via the deterministic discrete-time (DDT) method, that the algorithm is globally convergent. Simulations are carried out to illustrate the theory.
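Since the abstract does not spell out the update rule, the following Python sketch uses the classical single-unit Oja rule with a constant learning rate as an illustrative stand-in, not the paper's own algorithm; the synthetic covariance matrix, the value eta = 0.01, and all variable names are assumptions introduced here.

```python
import numpy as np

# Illustrative sketch (assumption): a single-unit Oja-type PCA rule
# run with a CONSTANT learning rate eta, rather than a rate that
# decays to zero. This is not the paper's exact algorithm.

rng = np.random.default_rng(0)

# Synthetic zero-mean data with a known covariance matrix C, so the
# true first principal eigenvector is available for comparison.
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])
X = rng.multivariate_normal(np.zeros(2), C, size=5000)

eta = 0.01                       # constant learning rate (assumed value)
w = rng.standard_normal(2)
w /= np.linalg.norm(w)           # start from a random unit vector

for x in X:
    y = w @ x                    # neuron output
    w += eta * y * (x - y * w)   # Oja's rule with constant eta

# Compare the learned weight vector with the dominant eigenvector of C.
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, np.argmax(eigvals)]
v1 *= np.sign(v1 @ w)            # align sign for comparison
print("learned w:            ", w)
print("principal eigenvector:", v1)
```

With a small constant eta, the iterate hovers near the principal eigenvector instead of converging exactly, which is the regime the paper's DDT analysis addresses.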