Open Access
Several Guaranteed Descent Conjugate Gradient Methods for Unconstrained Optimization
Author(s) -
Sanyang Liu,
Yuanyuan Huang
Publication year - 2014
Publication title -
Journal of Applied Mathematics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.307
H-Index - 43
eISSN - 1687-0042
pISSN - 1110-757X
DOI - 10.1155/2014/825958
Subject(s) - algorithm, computer science
This paper investigates a general form of guaranteed descent conjugate gradient methods that satisfy the descent condition g_k^T d_k ≤ -(1 - 1/(4θ_k))‖g_k‖², with θ_k > 1/4, and that are strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and report their numerical results on large-scale unconstrained optimization problems.
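To illustrate the kind of scheme the abstract describes, below is a minimal sketch (not the authors' specific methods, whose formulas are not given here) of a conjugate gradient iteration with a guaranteed descent direction and a weak Wolfe line search. It uses the Hager–Zhang update, a well-known member of this family that satisfies g_k^T d_k ≤ -(7/8)‖g_k‖², i.e. the abstract's condition with θ_k = 2; the quadratic test objective and all parameter values are illustrative assumptions.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Bisection line search enforcing the weak Wolfe conditions:
    sufficient decrease (Armijo) and the curvature condition."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * g0d:
            hi = alpha                     # Armijo fails: step too long
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * g0d:
            lo = alpha                     # curvature fails: step too short
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def cg_guaranteed_descent(f, grad, x0, tol=1e-8, max_iter=500):
    """CG with the Hager-Zhang beta, one instance of a guaranteed
    descent method (theta_k = 2 in the descent condition)."""
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # guaranteed descent: g^T d <= -(1 - 1/(4*theta)) ||g||^2, theta = 2
        assert g @ d <= -(7.0 / 8.0) * (g @ g) + 1e-10 * (1.0 + g @ g)
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y                         # > 0 under the Wolfe conditions
        beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = ½ xᵀAx − bᵀx with A = [[3, 1], [1, 2]] and b = (1, 1), the iteration converges to the minimizer A⁻¹b = (0.2, 0.4), and the assertion verifies the descent condition at every step.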
