
A new family of conjugate gradient coefficient with application
Author(s) -
Norrlaili Shapiee,
Mohd Rivaie,
Mustafa Mamat,
Puspa Liza Ghazali
Publication year - 2018
Publication title -
International Journal of Engineering and Technology
Language(s) - English
Resource type - Journals
ISSN - 2227-524X
DOI - 10.14419/ijet.v7i3.28.20962
Subject(s) - conjugate gradient method , line search , convergence , nonlinear conjugate gradient method , mathematical optimization , gradient descent , unconstrained optimization , algorithm , mathematics , computer science
Conjugate gradient (CG) methods are widely used for solving unconstrained optimization problems, particularly large-scale problems, and have attracted growing interest in fields such as engineering. In this paper, we propose a new family of CG coefficients and apply it to regression analysis. Global convergence is established under both exact and inexact line searches. Numerical results are presented in terms of the number of iterations and CPU time. The findings show that our method is more efficient than several previous CG methods on a set of standard test problems and successfully solves a real-life problem.
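The abstract does not reproduce the proposed coefficient itself. For orientation only, the sketch below shows a generic nonlinear CG iteration with a backtracking (Armijo) inexact line search, using the classical Fletcher-Reeves coefficient as a placeholder and a synthetic least-squares regression problem as the objective. Every name, formula, and parameter value in the sketch is an illustrative assumption, not the authors' new family of coefficients.

```python
# Minimal sketch: nonlinear CG with an Armijo backtracking line search,
# applied to a small least-squares regression problem. The Fletcher-Reeves
# beta is used as a stand-in; the paper's new CG coefficient is not shown.
import numpy as np

def armijo_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Backtracking step size satisfying the Armijo sufficient-decrease condition."""
    fx, slope = f(x), grad(x).dot(d)
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG iteration (Fletcher-Reeves coefficient, inexact line search)."""
    x = x0.copy()
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta_k (placeholder)
        d = -g_new + beta * d
        g = g_new
    return x, k

# Usage: least-squares regression min_w 0.5 * ||A w - b||^2 on synthetic data
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
w_true = np.array([2.0, -1.0, 0.5])
b = A @ w_true + 0.01 * rng.normal(size=100)

f = lambda w: 0.5 * np.sum((A @ w - b) ** 2)
grad = lambda w: A.T @ (A @ w - b)

w_est, iters = nonlinear_cg(f, grad, np.zeros(3))
```

Replacing the placeholder Fletcher-Reeves formula for beta with another CG coefficient changes only the single line marked above; the line search and the rest of the iteration stay the same.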