Is Jacobi--Davidson Faster than Davidson?

  • Authors:
  • Yvan Notay

  • Affiliations:
  • -

  • Venue:
  • SIAM Journal on Matrix Analysis and Applications
  • Year:
  • 2005

Abstract

The Davidson method is a popular technique for computing a few of the smallest (or largest) eigenvalues of a large sparse real symmetric matrix. It is effective when the matrix is nearly diagonal, that is, when the matrix of eigenvectors is close to the identity matrix. However, its convergence properties are not yet well understood, nor is how it compares to the more recent Jacobi--Davidson method, for which a proper convergence analysis exists. In this paper, we develop a new convergence analysis of the Davidson method. This analysis proves that convergence is fast for nearly diagonal matrices when the method is initialized in the standard way. In that setting, one should not expect any significant improvement from switching to the Jacobi--Davidson method. On the other hand, the latter may be more effective for more general initial approximations, and it is better suited to matrices that are not nearly diagonal, thanks to its use of more sophisticated preconditioning and/or inner iterations.
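
For readers who want to see the Davidson iteration concretely, the following is a minimal NumPy sketch, not code from the paper: the search space is started from a coordinate vector (the "standard" initialization for a nearly diagonal matrix), expanded with the diagonally preconditioned residual, and the Ritz pair is extracted by Rayleigh--Ritz projection. The function name, tolerance, and iteration limit are illustrative assumptions.

```python
import numpy as np

def davidson_smallest(A, tol=1e-8, max_iter=50):
    """Illustrative Davidson sketch for the smallest eigenpair of a symmetric matrix A."""
    n = A.shape[0]
    d = np.diag(A)
    # Standard initialization: coordinate vector at the smallest diagonal entry.
    j = int(np.argmin(d))
    V = np.zeros((n, 1))
    V[j, 0] = 1.0
    theta, u = d[j], V[:, 0]
    for _ in range(max_iter):
        # Rayleigh-Ritz projection onto the current search space span(V).
        H = V.T @ A @ V
        theta_all, S = np.linalg.eigh(H)
        theta, s = theta_all[0], S[:, 0]
        u = V @ s                        # Ritz vector
        r = A @ u - theta * u            # residual
        if np.linalg.norm(r) < tol:
            break
        # Davidson correction: diagonal preconditioning of the residual.
        t = r / (d - theta + 1e-12)      # small shift guards against division by ~0
        # Orthogonalize against the current basis and expand it.
        t -= V @ (V.T @ t)
        t /= np.linalg.norm(t)
        V = np.hstack([V, t[:, None]])
    return theta, u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    # Nearly diagonal test matrix: dominant diagonal plus a small symmetric perturbation.
    B = rng.standard_normal((n, n))
    A = np.diag(np.arange(1.0, n + 1.0)) + 0.01 * (B + B.T)
    theta, _ = davidson_smallest(A)
    print(theta, np.linalg.eigvalsh(A)[0])
```

Jacobi--Davidson differs in the correction step only: instead of the diagonal preconditioner, it (approximately) solves the projected correction equation (I - uuᵀ)(A - θI)(I - uuᵀ)t = -r, typically with an inner iterative solver, which is where the extra robustness for general starting vectors and less diagonal matrices comes from.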