New results on the convergence of the conjugate gradient method
Author(s) - Bouyouli R., Meurant G., Smoch L., Sadok H.
Publication year - 2009
Publication title - Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.618
Subject(s) - conjugate gradient method, mathematics, conjugate residual method, Lanczos algorithm, inverse, norm (mathematics), mathematical proof, derivation of the conjugate gradient method, convergence (mathematics), residual, positive definite matrix, nonlinear conjugate gradient method, matrix (mathematics), algorithm, gradient descent, eigenvalues and eigenvectors
This paper is concerned with proving theoretical results related to the convergence of the conjugate gradient (CG) method for solving symmetric positive definite linear systems. Considering the inverse of the projection of the inverse of the matrix, new relations for ratios of the A-norm of the error and the norm of the residual are provided, starting from some earlier results of Sadok (Numer. Algorithms 2005; 40:201–216). The proofs of our results rely on the well-known correspondence between the CG method and the Lanczos algorithm. Copyright © 2008 John Wiley & Sons, Ltd.
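As a concrete illustration (not taken from the paper), the following minimal Python sketch runs CG on a symmetric positive definite system and tracks the two quantities whose ratios the paper analyzes: the A-norm of the error and the 2-norm of the residual. The function name cg_with_norms and the parameters tol and max_iter are hypothetical, chosen only for this example.

import numpy as np

def cg_with_norms(A, b, x_star, tol=1e-10, max_iter=None):
    # Standard CG for SPD A, recording (||x* - x_k||_A, ||b - A x_k||_2)
    # at each iteration; x_star is the known exact solution.
    n = b.shape[0]
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual r_k = b - A x_k
    p = r.copy()           # search direction
    history = []
    for _ in range(max_iter):
        e = x_star - x
        history.append((np.sqrt(e @ (A @ e)),  # A-norm of the error
                        np.linalg.norm(r)))    # 2-norm of the residual
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)             # exact line search step
        x += alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x, history

# Example: a random SPD matrix; the A-norm of the error is monotonically
# decreasing along the CG iterates, while the residual norm need not be.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # symmetric positive definite by construction
x_star = rng.standard_normal(50)
b = A @ x_star
x, hist = cg_with_norms(A, b, x_star)
for k, (ea, rn) in enumerate(hist[:5]):
    print(f"iter {k}: ||e||_A = {ea:.3e}, ||r||_2 = {rn:.3e}")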