Global Convergence of a New Coefficient Nonlinear Conjugate Gradient Method
Author(s) - Nur Syarafina Mohamed, Mustafa Mamat, Mohd Rivaie, Shazlyn Milleana Shaharuddin
Publication year - 2018
Publication title - Indonesian Journal of Electrical Engineering and Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.241
H-Index - 17
eISSN - 2502-4760
pISSN - 2502-4752
DOI - 10.11591/ijeecs.v11.i3.pp1188-1193
Subject(s) - conjugate gradient method, convergence, nonlinear conjugate gradient method, line search, nonlinear system, mathematics, mathematical optimization, gradient method, gradient descent
Nonlinear conjugate gradient (CG) methods are widely used in optimization because of their efficiency in solving large-scale unconstrained optimization problems, and many modifications have been proposed to improve them. Such methods are known to satisfy the sufficient descent condition and to converge globally under the strong Wolfe-Powell line search. In this paper, a new coefficient for the CG method is presented. The sufficient descent and global convergence properties of the method with the new coefficient are established under the strong Wolfe-Powell line search. The results show that the method with the new coefficient converges globally under certain standard assumptions.
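The abstract does not reproduce the formula of the new coefficient, so the sketch below only illustrates the general framework it builds on: a nonlinear CG iteration whose step length is computed by a strong Wolfe-Powell line search. The coefficient used here (Fletcher-Reeves), the function name nonlinear_cg, the tolerance, and the Wolfe parameters c1 and c2 are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.optimize import line_search  # step length satisfying the strong Wolfe conditions


def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with a strong Wolfe-Powell line search.

    The CG coefficient below is Fletcher-Reeves, used only as a
    placeholder; the paper's new coefficient is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial (steepest-descent) direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe-Powell conditions
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:
            # Line search failed; restart along steepest descent
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves placeholder coefficient
        d = -g_new + beta * d               # new search direction
        g = g_new
    return x


# Usage example: minimize the Rosenbrock function; the iterates should approach (1, 1).
if __name__ == "__main__":
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```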
