Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. This paper undertakes a thorough investigation of the convergence properties of Oja's algorithm. The asymptotic convergence rates of the algorithm are derived, and the dependence of the algorithm on its initial weight matrix and on the singularity of the data covariance matrix is comprehensively addressed.
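To make the object of study concrete, the following is a minimal sketch of Oja's subspace rule with a constant learning rate, applied to synthetic data whose covariance matrix, dimensions, learning rate, and step count are all illustrative assumptions, not values from the paper. For each sample x, the weight matrix W is updated as W ← W + η (x yᵀ − W y yᵀ) with y = Wᵀx, which drives W toward an orthonormal basis of the principal subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): 5-D data with a clear
# spectral gap between the top-2 and remaining eigenvalues.
n, p, eta, steps = 5, 2, 0.005, 40000
C = np.diag([5.0, 4.0, 0.5, 0.3, 0.1])   # data covariance matrix
L = np.linalg.cholesky(C)

W = 0.1 * rng.standard_normal((n, p))    # initial weight matrix

for _ in range(steps):
    x = L @ rng.standard_normal(n)       # sample with covariance C
    y = W.T @ x                          # p-dimensional projection
    # Oja's subspace learning rule with constant learning rate eta.
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# After convergence W is near-orthonormal and spans the subspace of
# the top-2 eigenvectors (the first two coordinate axes here).
print(np.round(W.T @ W, 2))
```

Since C is diagonal, the principal subspace is spanned by the first two coordinate axes, so the rows of W outside the top-2 block should shrink toward zero while WᵀW approaches the identity; the residual fluctuations scale with the (constant) learning rate, which is exactly the regime whose convergence the paper analyzes.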