Abstract. The generalized Davidson (GD) method can be viewed as a generalization of the preconditioned steepest descent (PSD) method for solving symmetric eigenvalue problems. This generalization has two aspects. The most obvious one is that in the GD method the new approximation is sought in a larger subspace, namely, the one spanned by all the previous approximate eigenvectors, in addition to the current one and its preconditioned residual. The other aspect concerns the preconditioning. Most of the available results for the PSD method adopt the same view of preconditioning as in the case of linear systems. Consequently, they fail to detect superlinear convergence for certain "ideal" preconditioners, such as the one corresponding to the "exact" version of the Jacobi--Davidson method, one of the most familiar instances of the GD method. Focusing on the preconditioning aspect, this paper advocates an alternative approach to measuring the quality of preconditioning for eigenvalue problems and presents corresponding non-asymptotic convergence estimates for the GD method in general, and the Jacobi--Davidson method in particular, that correctly detect known cases of superlinear convergence.
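For concreteness, below is a minimal numerical sketch of the GD iteration as summarized above: each step performs a Rayleigh--Ritz projection on the accumulated subspace and then expands that subspace with the preconditioned residual of the current Ritz vector. The sketch assumes NumPy; the function name `generalized_davidson`, the user-supplied preconditioner `M_inv`, and the defaults are illustrative choices, not notation from the paper.

```python
import numpy as np

def generalized_davidson(A, M_inv, v0, tol=1e-8, max_iter=200):
    """Smallest eigenpair of a symmetric matrix A via a basic GD loop.

    M_inv(r) applies a preconditioner to the residual r.  All names,
    defaults, and the stopping rule here are illustrative, not taken
    from the paper.
    """
    V = v0[:, None] / np.linalg.norm(v0)      # orthonormal basis of the search subspace
    for _ in range(max_iter):
        # Rayleigh-Ritz projection onto span(V)
        theta, S = np.linalg.eigh(V.T @ (A @ V))
        lam, x = theta[0], V @ S[:, 0]        # current Ritz pair
        r = A @ x - lam * x                   # eigenresidual
        if np.linalg.norm(r) < tol:
            break
        t = M_inv(r)                          # preconditioned residual
        t -= V @ (V.T @ t)                    # orthogonalize against the subspace
        V = np.hstack([V, (t / np.linalg.norm(t))[:, None]])
    return lam, x

# Toy usage with a diagonal (Jacobi-style) preconditioner:
n = 100
A = np.diag(np.arange(1.0, n + 1.0))
A[0, 1] = A[1, 0] = 0.1                       # small off-diagonal coupling
d = np.diag(A)
lam, x = generalized_davidson(A, lambda r: r / d, np.ones(n))
# lam is close to the smallest eigenvalue of A (about 1 here)
```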