Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems Under Limited Memory. Part II: Seeking Many Eigenvalues

  • Authors:
  • Andreas Stathopoulos; James R. McCombs


  • Venue:
  • SIAM Journal on Scientific Computing
  • Year:
  • 2007

Abstract

In a recent companion paper, we proposed two methods, GD+k and JDQMR, as nearly optimal methods for finding one eigenpair of a real symmetric matrix. In this paper, we seek nearly optimal methods for a large number, $nev$, of eigenpairs that work with a search space whose size is $O(1)$, independent of $nev$. The motivation is twofold: to avoid the additional $O(nev N)$ storage and the $O(nev^2 N)$ iteration costs. First, we provide an analysis of the oblique projectors required in the Jacobi-Davidson method and identify ways to avoid them during the inner iterations, either completely or partially. Second, we develop a comprehensive set of performance models for GD+k, Jacobi-Davidson type methods, and ARPACK. Based both on theoretical arguments and on our models, we argue that any eigenmethod with an $O(1)$ basis size, preconditioned or not, will be superseded asymptotically by Lanczos-type methods that use $O(nev)$ vectors in the basis. However, this may not happen until $nev$ approaches $O(1000)$. Third, we perform an extensive set of experiments with our methods and against other state-of-the-art software; these validate our models and confirm our GD+k and JDQMR methods as nearly optimal within the class of $O(1)$ basis size methods.
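
For orientation (not part of the original abstract): one common way to arrive at the $O(nev N)$ and $O(nev^2 N)$ figures is to note that keeping $nev$ converged Ritz vectors of length $N$ costs $O(nev N)$ storage, and that each new iterate must be orthogonalized against all vectors converged so far. A rough count, assuming $O(1)$ outer iterations per eigenpair:

\[
\text{storage} = O(nev\,N), \qquad
\text{orthogonalization} \;\approx\; \sum_{j=1}^{nev} O(j\,N) \;=\; O(nev^{2} N).
\]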
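
As background on where the oblique projectors of the second contribution arise (a standard Jacobi-Davidson sketch, not a statement of the authors' specific variant): with $Q$ spanning the current Ritz vector and any locked eigenvectors, $\theta$ the Ritz value, and $r$ the residual, the correction $t$ is sought orthogonally to $Q$ from

\[
(I - QQ^{*})(A - \theta I)(I - QQ^{*})\, t = -r, \qquad t \perp Q .
\]

Applying a preconditioner $K \approx A - \theta I$ within this subspace leads to the oblique projection

\[
\left(I - K^{-1}Q\,(Q^{*}K^{-1}Q)^{-1}Q^{*}\right) K^{-1},
\]

whose application requires $K^{-1}Q$ and the small matrix $Q^{*}K^{-1}Q$ during the inner iterations; avoiding this cost, completely or partially, is what the paper's first contribution analyzes.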