A Descent Dai-Liao Conjugate Gradient Method Based on a Modified Secant Equation and Its Global Convergence
Author(s) -
Ioannis E. Livieris,
Panagiotis Pintelas
Publication year - 2012
Publication title -
isrn computational mathematics
Language(s) - English
Resource type - Journals
ISSN - 2090-7842
DOI - 10.5402/2012/435495
Subject(s) - conjugate gradient method , nonlinear conjugate gradient method , descent direction , line search , secant equation , global convergence , mathematical optimization , mathematics
We propose a conjugate gradient method based on a study of the Dai-Liao conjugate gradient method. An important property of the proposed method is that it ensures sufficient descent independently of the accuracy of the line search. Moreover, it achieves high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. (2010). Under mild conditions, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Numerical experiments are also presented.
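The following is a minimal sketch of a Dai-Liao-type conjugate gradient iteration with a Wolfe line search, included only to illustrate the general structure the abstract describes. The parameter t, the restart safeguard, the choice u_k = s_k in the modified secant vector, and the use of SciPy's (strong) Wolfe line search are all assumptions for this sketch, not the authors' exact algorithm.

```python
import numpy as np
from scipy.optimize import line_search


def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Dai-Liao-type CG sketch with a modified secant vector (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search along d (returns None on failure).
        alpha, *_ = line_search(f, grad, x, d)
        if alpha is None:
            alpha = 1e-4                      # crude fallback step (assumption)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x
        y = g_new - g
        # Modified secant vector in the general form y_bar = y + (theta / s^T s) s,
        # with theta built from function values and gradients; the exact form used
        # in the paper may differ.
        theta = 6.0 * (f(x) - f(x_new)) + 3.0 * (g + g_new) @ s
        y_bar = y + (theta / (s @ s)) * s
        denom = d @ y_bar
        if abs(denom) < 1e-12:                # safeguard against division by ~0
            beta = 0.0
        else:
            # Dai-Liao-type parameter with y replaced by the modified vector.
            beta = (g_new @ y_bar - t * (g_new @ s)) / denom
        d = -g_new + beta * d
        if g_new @ d >= 0:                    # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Quick smoke test on the Rosenbrock function.
    from scipy.optimize import rosen, rosen_der
    print(dai_liao_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```

The descent safeguard shown here simply resets the direction to the negative gradient; the paper instead establishes sufficient descent analytically for its particular choice of the conjugate gradient parameter.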
