An increasing‐angle property of the conjugate gradient method and the implementation of large‐scale minimization algorithms with line searches
Author(s) -
Dai YuHong,
Martínez José Mario,
Yuan JinYun
Publication year - 2003
Publication title -
Numerical Linear Algebra with Applications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.02
H-Index - 53
eISSN - 1099-1506
pISSN - 1070-5325
DOI - 10.1002/nla.305
Subject(s) - conjugate gradient method , nonlinear conjugate gradient method , line search , quadratic equation , gradient descent , mathematical optimization , mathematics , algorithm
The search direction in unconstrained minimization algorithms for large‐scale problems is usually computed as an iterate of the (preconditioned) conjugate gradient method applied to the minimization of a local quadratic model. In line‐search procedures this direction is required to satisfy an angle condition, which states that the angle between the negative gradient at the current point and the direction is bounded away from π/2. In this paper, it is shown that the angle between conjugate gradient iterates and the negative gradient strictly increases as the conjugate gradient algorithm proceeds. Therefore, the interruption of the conjugate gradient sub‐algorithm when the angle condition does not hold is theoretically justified. Copyright © 2002 John Wiley & Sons, Ltd.
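The increasing‐angle property described in the abstract can be observed numerically. The sketch below is not the paper's implementation; the test Hessian model B (a diagonal SPD matrix with condition number 10³), the problem size, and the standard CG recurrence are all assumptions made for illustration. It applies CG to the local quadratic model starting from d₀ = 0 and records the angle between each iterate dₖ and the negative gradient −g; the recorded angles grow monotonically, which is why truncating the CG sub‐algorithm at the first violation of the angle condition loses nothing.

```python
import numpy as np

# Hypothetical test problem: a diagonal SPD Hessian model B with
# condition number 1e3, and a random gradient g at the current point.
rng = np.random.default_rng(0)
n = 30
B = np.diag(np.logspace(0, 3, n))
g = rng.standard_normal(n)

def cg_direction_angles(B, g, max_iters):
    """Apply CG to the quadratic model q(d) = g^T d + (1/2) d^T B d
    (equivalently, solve B d = -g) starting from d_0 = 0, recording
    the angle between each iterate d_k and the negative gradient -g."""
    d = np.zeros_like(g)
    r = -g.copy()            # residual of B d = -g at d_0 = 0
    p = r.copy()
    angles = []
    for _ in range(max_iters):
        Bp = B @ p
        alpha = (r @ r) / (p @ Bp)
        d = d + alpha * p
        cos = (d @ (-g)) / (np.linalg.norm(d) * np.linalg.norm(g))
        angles.append(float(np.arccos(np.clip(cos, -1.0, 1.0))))
        r_new = r - alpha * Bp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return angles

angles = cg_direction_angles(B, g, 10)
# theta_1 = 0 since d_1 is a positive multiple of -g; thereafter the
# angle increases, so once theta_k exceeds pi/2 - eps it never recovers.
print([round(a, 4) for a in angles])
```

Because the angles are monotone, an implementation need only compare the current angle against the line‐search threshold once per CG iteration and can stop the inner loop permanently at the first failure.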
