|Title||Jacobi correction equation, line search and conjugate gradients in Hermitian eigenvalue computation I: computing an extreme eigenvalue|
This paper is concerned with the convergence properties of iterative algorithms of conjugate gradient type applied to the computation of an extreme eigenvalue of a Hermitian operator via optimization of the Rayleigh quotient functional. The algorithms in focus employ a line search for the extremum of the Rayleigh quotient functional along a direction that is a linear combination of the gradient and the previous search direction. An asymptotically explicit equation for the reduction in the eigenvalue error after the line search step, as a function of the search direction, is derived and applied to the analysis of the local convergence properties (i.e., the error reduction after an iteration) of the algorithms in question. The local (stepwise) asymptotic equivalence of various conjugate gradient algorithms and the generalized Davidson method is proved, and a new convergence estimate for conjugate gradient iterations is derived, bounding the reduction in the eigenvalue error after any two consecutive iterations. The analysis extensively employs remarkable properties of the operator of the so-called Jacobi orthogonal complement correction equation.
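The class of methods the abstract describes can be illustrated by a minimal sketch, not taken from the paper itself: a nonlinear conjugate gradient iteration for the smallest eigenvalue of a real symmetric matrix, where the "exact line search" for the extremum of the Rayleigh quotient along span{x, p} is realized by a 2x2 Rayleigh-Ritz projection, and the Polak-Ribiere formula is one assumed choice for combining the gradient with the previous search direction.

```python
import numpy as np

def rayleigh_cg(A, x0, tol=1e-10, maxit=500):
    """Conjugate-gradient-type minimization of the Rayleigh quotient
    rho(x) = (x, Ax)/(x, x) for symmetric A (illustrative sketch only).

    The line search minimizes rho exactly over span{x, p} via a
    2x2 Rayleigh-Ritz projection."""
    x = x0 / np.linalg.norm(x0)
    Ax = A @ x
    lam = x @ Ax
    g = Ax - lam * x              # Rayleigh-quotient gradient (up to scaling)
    p = -g                        # initial search direction: steepest descent
    for _ in range(maxit):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search: Rayleigh-Ritz on the 2D subspace span{x, p}
        Q, _ = np.linalg.qr(np.column_stack([x, p]))
        w, U = np.linalg.eigh(Q.T @ A @ Q)
        x_new = Q @ U[:, 0]       # Ritz vector for the smallest Ritz value
        Ax = A @ x_new
        lam = x_new @ Ax
        g_new = Ax - lam * x_new
        # Polak-Ribiere-type coefficient (one common, assumed choice)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        p = -g_new + beta * p     # new direction: gradient + previous direction
        x, g = x_new, g_new
    return lam, x
```

On a well-separated spectrum this converges to the extreme (smallest) eigenvalue; preconditioned variants replace `-g_new` by a preconditioned residual, which is where the connection to the generalized Davidson method arises.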
|Keywords||Hermitian eigenvalue computation, Jacobi orthogonal complement correction equation, conjugate gradients, convergence estimates, generalized Davidson method|
|Journal||SIAM Journal on Numerical Analysis|
|Journal citation||46 (5), pp. 2567–2592|
|Publisher||Society for Industrial and Applied Mathematics|
|Digital Object Identifier (DOI)||https://doi.org/10.1137/070688742|