The goal of this paper is to present an elegant relationship between an implicitly restarted Arnoldi method (IRAM) and nonstationary (subspace) simultaneous iteration. This relationship allows the geometric convergence theory developed for nonstationary simultaneous iteration by Watkins and Elsner [Linear Algebra Appl., 143 (1991), pp. 19--47] to be applied to the rate of convergence of an IRAM. We also comment on the relationship with other restarting schemes. A set of experiments demonstrates that implicitly restarted methods can converge at a much faster rate than simultaneous iteration when iterating on a subspace of equal dimension.
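The two methods being compared can be sketched numerically. Below is a minimal illustration (not the paper's experiments): a basic simultaneous (subspace) iteration with a Rayleigh-Ritz step, compared against SciPy's `scipy.sparse.linalg.eigs`, which wraps ARPACK's implicitly restarted Arnoldi method. The test matrix, iteration count, and tolerances are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.sparse.linalg import eigs  # ARPACK wrapper: implicitly restarted Arnoldi (IRAM)

def subspace_iteration(A, k, iters=200, seed=0):
    """Simultaneous (subspace) iteration: repeatedly apply A to a block
    of k vectors and re-orthonormalize, then extract Ritz values."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)
    # Rayleigh-Ritz step: eigenvalues of the projected k x k matrix
    return np.sort(np.linalg.eigvals(Q.T @ A @ Q).real)[::-1]

# Diagonal test matrix with a well-separated dominant part of the spectrum
n, k = 200, 3
rng = np.random.default_rng(1)
A = np.diag(np.concatenate(([10.0, 8.0, 6.0], rng.uniform(0.0, 1.0, n - 3))))

ritz = subspace_iteration(A, k)
iram = np.sort(eigs(A, k=k, which='LM', return_eigenvectors=False).real)[::-1]

# Both approximate the three dominant eigenvalues 10, 8, 6; the paper's
# point is that IRAM typically reaches this accuracy in far fewer
# matrix-vector products than plain simultaneous iteration.
print(ritz)
print(iram)
```

The convergence rate of the simultaneous iteration above is governed by the eigenvalue ratios (here roughly λ₄/λ₃ < 1/6 per step), which is the geometric behavior analyzed by Watkins and Elsner; the paper's contribution is showing the same framework bounds IRAM's convergence.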