Open Access
Two-versions of descent conjugate gradient methods for large-scale unconstrained optimization
Author(s) - Hawraz N. Jabbar, Basim A. Hassan
Publication year - 2021
Publication title - Indonesian Journal of Electrical Engineering and Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.241
H-Index - 17
eISSN - 2502-4760
pISSN - 2502-4752
DOI - 10.11591/ijeecs.v22.i3.pp1643-1649
Subject(s) - conjugate gradient method, hessian matrix, nonlinear conjugate gradient method, mathematics, gradient descent, taylor series, derivation of the conjugate gradient method, diagonal, gradient method, scale (ratio), conjugate, descent (aeronautics), conjugate residual method, diagonal matrix, mathematical optimization, algorithm, computer science, mathematical analysis, artificial intelligence, geometry, artificial neural network, physics, quantum mechanics, engineering, aerospace engineering
Conjugate gradient methods are noted to be exceedingly valuable for solving large-scale unconstrained optimization problems, since they do not require the storage of matrices. Research on conjugate gradient methods focuses mostly on the choice of conjugate parameter. The current paper proposes new conjugate-gradient-type parameters for solving large-scale unconstrained optimization problems. A Hessian approximation in diagonal matrix form, based on second- and third-order Taylor series expansions, was employed in this study. The sufficient descent property of the proposed algorithms is proved, and the new methods are shown to converge globally. The new algorithms are found to be competitive with the Fletcher-Reeves (FR) algorithm in a number of numerical experiments.
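For context, the sketch below shows the general shape of the class of methods the abstract describes: a nonlinear conjugate gradient loop using the Fletcher-Reeves parameter, the baseline the paper compares against. The function names, line-search constants, and test problem are illustrative assumptions, not the authors' implementation; the paper's proposed conjugate parameters, derived from a diagonal Hessian approximation via Taylor expansion, would take the place of the FR beta.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=5000, tol=1e-6):
    """Generic nonlinear conjugate gradient sketch with the
    Fletcher-Reeves (FR) beta. The paper's proposed parameters
    (built from a diagonal Hessian approximation) would replace
    the beta computed below."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking line search (Armijo condition); the
        # paper's exact line-search conditions are not given here.
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:   # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Illustrative test problem (Rosenbrock), not one from the paper.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))
```

The key design point, no matter which beta is used, is that each iterate needs only a few vectors (current point, gradient, direction) rather than any stored matrix, which is why the abstract stresses suitability for large-scale problems.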
