Given a positive definite matrix M and an integer Nm ≥ 1, Oja's subspace algorithm provides convergent estimates of the Nm largest eigenvalues of M and the corresponding eigenvectors; it is a standard approach to principal component analysis. This paper introduces a normalized stochastic-approximation implementation of Oja's subspace algorithm, together with new applications to the spectral decomposition of a reversible Markov chain. Stability and convergence are established under conditions far milder than those assumed in previous work. Applications to graph clustering and Markov spectral decomposition are surveyed, and numerical results are presented.
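To make the iteration concrete, the following is a minimal sketch of the deterministic discrete-time form of Oja's subspace update, W ← W + η(MW − W(WᵀMW)), applied to a known positive definite matrix. The function name, step size, and iteration count are illustrative choices, not the paper's normalized stochastic-approximation scheme; the Rayleigh quotient WᵀMW is used to recover eigenvalue estimates at convergence.

```python
import numpy as np

def oja_subspace(M, Nm, eta=0.01, iters=5000, seed=0):
    """Sketch of the deterministic discrete-time Oja subspace iteration.

    Drives W toward an orthonormal basis of the subspace spanned by the
    eigenvectors of the Nm largest eigenvalues of the positive definite M.
    """
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    W, _ = np.linalg.qr(rng.standard_normal((n, Nm)))  # orthonormal start
    for _ in range(iters):
        MW = M @ W
        W = W + eta * (MW - W @ (W.T @ MW))  # Oja's subspace update
    # Eigenvalues of the Rayleigh quotient estimate the leading spectrum.
    evals = np.linalg.eigvalsh(W.T @ M @ W)[::-1]
    return W, evals

# Toy example: a matrix with known spectrum {5, 3, 1, 0.5}.
A = np.diag([5.0, 3.0, 1.0, 0.5])
W, evals = oja_subspace(A, Nm=2)
print(np.round(evals, 3))  # approximately [5., 3.]
```

In the stochastic-approximation setting studied in the paper, the exact matrix M in the update is replaced by noisy observations and the step size decays over time; this sketch uses the full matrix and a fixed step purely for illustration.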