A Modified Hybrid Conjugate Gradient Method for Unconstrained Optimization
Author(s) - Minglei Fang, Min Wang, Min Sun, Rong Chen
Publication year - 2021
Publication title - Journal of Mathematics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.252
H-Index - 13
eISSN - 2314-4785
pISSN - 2314-4629
DOI - 10.1155/2021/5597863
Subject(s) - conjugate gradient method , nonlinear conjugate gradient method , conjugate residual method , derivation of the conjugate gradient method , mathematics , line search , gradient method , convergence (economics) , biconjugate gradient method , conjugate , mathematical optimization , line (geometry) , algorithm , gradient descent , computer science , mathematical analysis , artificial neural network , geometry , artificial intelligence , computer security , economics , radius , economic growth
Nonlinear conjugate gradient algorithms are very effective for solving large-scale unconstrained optimization problems. Building on several well-known conjugate gradient methods, a modified hybrid conjugate gradient method was proposed. The proposed method generates descent directions at every iteration, independently of any line search. Under the Wolfe line search, the proposed method is globally convergent. Numerical results show that the modified method is efficient and robust.
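For illustration, the sketch below shows the general shape of a hybrid nonlinear conjugate gradient iteration with a Wolfe line search. It is not the paper's specific update: the beta used here is the classic clamped Hestenes-Stiefel/Dai-Yuan hybrid, and the function name `hybrid_cg` and the fallback step size are placeholders chosen for this example.

```python
# Minimal sketch of a hybrid nonlinear conjugate gradient loop with a
# Wolfe line search. The beta here (HS clamped between 0 and DY) is an
# illustrative hybrid choice, NOT the modified formula proposed in the paper.
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search along d (scipy enforces the strong Wolfe conditions)
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                 # line search failed; take a small fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        # Hybrid beta: clamp Hestenes-Stiefel between 0 and Dai-Yuan, which
        # keeps the new direction a descent direction under the Wolfe conditions.
        beta_hs = (g_new @ y) / denom if denom != 0 else 0.0
        beta_dy = (g_new @ g_new) / denom if denom != 0 else 0.0
        beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard starting point.
    from scipy.optimize import rosen, rosen_der
    print(hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # approaches [1, 1]
```

The clamping in the beta update is what makes such hybrids attractive: it inherits the strong convergence behavior of Dai-Yuan while retaining the practical efficiency of Hestenes-Stiefel, which is the same general trade-off the paper's modified hybrid method targets.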