The convergence of Oja's principal component analysis (PCA) learning algorithms is difficult to study directly. Traditionally, their convergence is analyzed indirectly via associated deterministic continuous time (DCT) systems, but this approach requires the learning rate to converge to zero, an unreasonable requirement in many practical applications. Recently, deterministic discrete time (DDT) systems have been proposed instead to interpret the dynamics of these learning algorithms. Unlike DCT systems, DDT systems allow the learning rate to be a nonzero constant. This paper provides several results on the convergence of a DDT system of Oja's PCA learning algorithm; its contributions are summarized below, followed by an illustrative formulation and a simulation sketch.

1) A number of invariant sets are obtained. Any trajectory starting from a point in such a set remains in the set forever, so nondivergence of the trajectories is guaranteed.
2) The convergence of the DDT system is analyzed rigorously. It is proven that almost all trajectories starting from points in an invariant set converge exponentially to the unit eigenvector associated with the largest eigenvalue of the correlation matrix. In addition, exponential convergence rates are obtained, providing useful guidelines for selecting a learning rate that yields fast convergence.
3) Since trajectories starting elsewhere may diverge, the choice of initial vectors is an important issue. The paper suggests taking initial vectors from the unit hypersphere to guarantee convergence.
4) Simulation results are furnished to illustrate the theoretical findings.
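For concreteness, the DDT system in question is usually written in the following standard form from the literature on Oja's rule (this notation is assumed here, not quoted from the paper): C denotes the correlation matrix, eta > 0 the constant learning rate, and w(k) the weight vector at step k.

```latex
w(k+1) \;=\; w(k) + \eta \left[\, C\, w(k) - \bigl( w(k)^{\mathsf{T}} C\, w(k) \bigr)\, w(k) \,\right],
\qquad k = 0, 1, 2, \ldots
```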
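A minimal numerical sketch of this iteration follows. The function name oja_ddt, the learning-rate value, the number of steps, and the synthetic correlation matrix are illustrative assumptions, not taken from the paper; the initialization on the unit hypersphere follows the paper's suggestion for guaranteeing convergence.

```python
import numpy as np

def oja_ddt(C, w0, eta=0.05, n_steps=2000):
    """Iterate the DDT system of Oja's rule with a constant learning rate:
    w(k+1) = w(k) + eta * (C w(k) - (w(k)^T C w(k)) w(k)).
    """
    w = w0.astype(float).copy()
    trajectory = [w.copy()]
    for _ in range(n_steps):
        Cw = C @ w
        w = w + eta * (Cw - (w @ Cw) * w)
        trajectory.append(w.copy())
    return np.array(trajectory)

rng = np.random.default_rng(0)

# Illustrative correlation matrix: sample covariance of Gaussian data.
A = rng.standard_normal((5, 200))
C = A @ A.T / 200.0

# Initial vector drawn on the unit hypersphere, per the paper's suggestion,
# so the trajectory starts inside an invariant set and does not diverge.
w0 = rng.standard_normal(5)
w0 /= np.linalg.norm(w0)

traj = oja_ddt(C, w0, eta=0.05)  # eta is an assumed, conservatively small value

# Compare against the principal eigenvector from a direct eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, -1]  # unit eigenvector of the largest eigenvalue
align = abs(traj[-1] @ v1) / np.linalg.norm(traj[-1])
print(f"cosine alignment with principal eigenvector: {align:.6f}")
print(f"final weight norm: {np.linalg.norm(traj[-1]):.6f}")
```

With a sufficiently small constant eta, the cosine alignment and the weight norm both approach 1, consistent with the exponential convergence to the unit principal eigenvector described above.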