Convergence of Polynomial Restart Krylov Methods for Eigenvalue Computations

  • Authors:
  • Christopher A. Beattie, Mark Embree, D. C. Sorensen

  • Venue:
  • SIAM Review
  • Year:
  • 2005

Abstract

Krylov subspace methods have led to reliable and effective tools for resolving large-scale, non-Hermitian eigenvalue problems. Since practical considerations often limit the dimension of the approximating Krylov subspace, modern algorithms attempt to identify and condense significant components from the current subspace, encode them into a polynomial filter, and then restart the Krylov process with a suitably refined starting vector. In effect, polynomial filters dynamically steer low-dimensional Krylov spaces toward a desired invariant subspace through their action on the starting vector. The spectral complexity of nonnormal matrices makes convergence of these methods difficult to analyze, and these effects are further complicated by the polynomial filter process. The principal object of study in this paper is the angle an approximating Krylov subspace forms with a desired invariant subspace. Convergence analysis is posed in a geometric framework that is robust to eigenvalue ill-conditioning, yet remains relatively uncluttered. The bounds described here suggest that the sensitivity of desired eigenvalues exerts little influence on convergence, provided the associated invariant subspace is well-conditioned; ill-conditioning of unwanted eigenvalues plays an essential role. This framework also gives insight into the design of effective polynomial filters. Numerical examples illustrate the subtleties that arise when restarting non-Hermitian iterations.
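The restart mechanism the abstract describes, along with the angle between an approximating Krylov subspace and a desired invariant subspace, can be illustrated on a toy problem. The sketch below (a minimal illustration, not the authors' algorithm) builds a Krylov basis with Arnoldi, applies an exact-shift polynomial filter p(A)v = ∏ⱼ(A − μⱼI)v whose roots μⱼ are the unwanted Ritz values, restarts with the filtered vector, and tracks the largest principal angle to the known invariant subspace. The diagonal test matrix, subspace dimensions, and number of restart cycles are all illustrative choices.

```python
import numpy as np

def arnoldi_basis(A, v, m):
    """Orthonormal basis for the Krylov subspace K_m(A, v)."""
    n = len(v)
    Q = np.zeros((n, m))
    Q[:, 0] = v / np.linalg.norm(v)
    for j in range(m - 1):
        w = A @ Q[:, j]
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)  # Gram-Schmidt step
        Q[:, j + 1] = w / np.linalg.norm(w)
    return Q

def largest_principal_angle(Q, U):
    """Largest principal angle between range(Q) and range(U), both orthonormal."""
    s = np.linalg.svd(Q.T @ U, compute_uv=False)
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

# Toy problem: diagonal A whose two largest eigenvalues are well separated;
# U spans the corresponding (desired) invariant subspace.
rng = np.random.default_rng(0)
n, m, k = 200, 6, 2
lam = np.linspace(0.0, 1.0, n)
lam[-k:] = [2.0, 2.5]                     # wanted eigenvalues
A = np.diag(lam)
U = np.eye(n)[:, -k:]                     # exact invariant subspace

v = rng.standard_normal(n)
angle0 = largest_principal_angle(arnoldi_basis(A, v, m), U)

for _ in range(5):                        # restart cycles
    Q = arnoldi_basis(A, v, m)
    H = Q.T @ A @ Q
    # Exact-shift filter: the m - k smallest Ritz values serve as polynomial
    # roots, suppressing the unwanted part of the spectrum in v.
    shifts = np.sort(np.linalg.eigvals(H).real)[: m - k]
    for mu in shifts:                     # apply p(A)v = prod_j (A - mu_j I) v
        v = A @ v - mu * v
    v /= np.linalg.norm(v)

angle = largest_principal_angle(arnoldi_basis(A, v, m), U)
```

Each restart steers the Krylov space toward the invariant subspace: the angle `angle` after filtering is far smaller than the initial angle `angle0`. For genuinely nonnormal matrices the paper's analysis shows the picture is subtler than this normal (diagonal) example suggests.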