On a generalized conjugate gradient orthogonal residual method
Author(s) -
Axelsson, Owe,
Makarov, M.
Publication year - 1995
Publication title -
Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.1680020507
Subject(s) - mathematics , conjugate gradient method , conjugate residual method , generalized minimal residual method , residual , residual norm , rate of convergence , matrix computations , mathematical optimization , algorithm , computer science
To solve a linear system of equations with a generally nonsymmetric matrix, a generalized conjugate gradient orthogonal residual method is presented. The method uses all previous search directions (or a truncated set of them) at each step but, contrary to standard implementations of similar methods, it requires storage of only one set of vectors, whose number grows linearly (or equals the size of the truncated set). Furthermore, only one vector (the residual) must be updated against all vectors in this set at each step. In this respect the method resembles the popular GMRES method, but it has the additional advantage that it can stop at any stage once the norm of the residual is sufficiently small, and no extra computation is needed to obtain this norm. Moreover, the method admits a truncated variant. The rate of convergence of the method is also discussed.
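To illustrate the family of methods the abstract describes, the sketch below implements the closely related standard Generalized Conjugate Residual (GCR) method with optional truncation, not the authors' specific one-set variant (standard GCR stores two vector sets, the search directions and their images under A, whereas the paper's method reduces the storage to one set). The function name `gcr` and all parameter names are illustrative choices, not taken from the paper. Note how the residual norm is available at every step without extra work, the property the abstract highlights.

```python
import numpy as np

def gcr(A, b, x0=None, tol=1e-10, maxiter=None, trunc=None):
    """Truncated GCR sketch for a (generally nonsymmetric) system Ax = b.

    Each new direction is orthogonalized, in the A^T A inner product,
    against all stored previous directions (or only the last `trunc`
    of them in the truncated variant).
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x
    P, AP = [], []                       # directions p_j and their images A p_j
    maxiter = n if maxiter is None else maxiter
    for _ in range(maxiter):
        if np.linalg.norm(r) < tol:      # residual norm is free at every step
            break
        p, Ap = r.copy(), A @ r
        # Orthogonalize A p against all previous A p_j, updating p consistently
        for pj, Apj in zip(P, AP):
            beta = (Ap @ Apj) / (Apj @ Apj)
            p -= beta * pj
            Ap -= beta * Apj
        alpha = (r @ Ap) / (Ap @ Ap)     # minimizes the residual norm along p
        x += alpha * p
        r -= alpha * Ap
        P.append(p)
        AP.append(Ap)
        if trunc is not None and len(P) > trunc:
            P.pop(0)                     # truncated variant: drop the oldest
            AP.pop(0)                    # direction and its image
    return x, np.linalg.norm(r)
```

For example, `gcr(A, b)` on a small nonsymmetric matrix with positive definite symmetric part converges in at most `n` steps in exact arithmetic; passing `trunc=m` keeps only the last `m` directions, trading some convergence speed for bounded storage.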