
A new modification of the nonlinear conjugate gradient method with strong Wolfe-Powell line search
Author(s) - Chergui Ahmed, Tahar Bouali
Publication year - 2020
Publication title - Indonesian Journal of Electrical Engineering and Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.241
H-Index - 17
eISSN - 2502-4760
pISSN - 2502-4752
DOI - 10.11591/ijeecs.v18.i1.pp525-532
Subject(s) - conjugate gradient method , nonlinear conjugate gradient method , line search , gradient descent , gradient method , convergence , mathematical optimization , algorithm , mathematics , computer science
The conjugate gradient (CG) method has played a special role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients that possesses the sufficient descent condition and global convergence properties; the method is similar to that of Wei et al. [7]. The global convergence result is established under the strong Wolfe-Powell line search. Numerical results on a set of test functions show that the proposed formula gives the best results in CPU time, number of iterations, and number of gradient evaluations when compared with the FR, PRP, DY, and WYL methods.
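The abstract does not reproduce the new coefficient formula, so the sketch below is only an illustration of the general setting it describes: a nonlinear CG loop driven by a strong Wolfe-Powell line search, here with the classical WYL coefficient of Wei et al. standing in for the proposed family. The function names wyl_beta and nonlinear_cg, the parameter choices c1 = 1e-4 and c2 = 0.1, and the non-negativity truncation of beta are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def wyl_beta(g_new, g_old):
    """WYL coefficient (Wei et al.); the paper's new family modifies this
    form, but the exact formula is not given in the abstract."""
    scale = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - scale * g_old) / (np.linalg.norm(g_old) ** 2)


def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG with a strong Wolfe(-Powell) line search
    (SciPy's line_search enforces the strong Wolfe conditions)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # c2 = 0.1 is a common choice for CG methods (assumption, not from the paper)
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                    # line search failed: restart along -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        beta = max(wyl_beta(g_new, g), 0.0)  # truncate to help preserve descent
        d = -g_new + beta * d
        g = g_new
    return x, k


if __name__ == "__main__":
    # Rosenbrock is a standard unconstrained test function for CG comparisons.
    x_star, iters = nonlinear_cg(rosen, rosen_der, np.full(10, -1.2))
    print(iters, rosen(x_star))
```

Swapping wyl_beta for the FR, PRP, or DY formulas in the same loop is how the comparison reported in the abstract (iterations, gradient evaluations, CPU time) would typically be set up.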