Open Access
A New Hybrid Conjugate Gradient Method with Guaranteed Descent for Unconstraint Optimization
Author(s) - Basim A. Hassan
Publication year - 2018
Publication title - Al-Mustansiriyah Journal of Science
Language(s) - English
Resource type - Journals
eISSN - 2521-3520
pISSN - 1814-635X
DOI - 10.23851/mjs.v28i3.114
Subject(s) - nonlinear conjugate gradient method , conjugate gradient method , gradient descent , global convergence , mathematical optimization , mathematics , computer science
The conjugate gradient method is an efficient technique for solving unconstrained optimization problems. In this paper, we propose a new hybrid nonlinear conjugate gradient method that produces a descent direction at every iteration and is globally convergent under certain conditions. Numerical results show that the new hybrid method is efficient on the given test problems.
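The abstract describes a hybrid nonlinear conjugate gradient method with guaranteed descent, but the paper's specific hybrid formula is not reproduced here. As an illustrative sketch only, the code below uses a well-known hybrid choice, beta = max(0, min(beta_HS, beta_DY)) (a Hestenes-Stiefel/Dai-Yuan hybrid), with an Armijo backtracking line search and a steepest-descent restart as the descent safeguard; the function name `hybrid_cg` and all parameter values are assumptions for illustration, not the authors' method.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Minimize f by a hybrid nonlinear conjugate gradient method.

    Illustrative hybrid rule (NOT necessarily the paper's):
    beta = max(0, min(beta_HS, beta_DY)).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gTd = g @ d
        if gTd >= 0:                         # safeguard: restart if not descent
            d = -g
            gTd = g @ d
        # Armijo backtracking line search
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * gTd:
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y                        # shared denominator of HS and DY
        if abs(denom) > 1e-12:
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))
        else:
            beta = 0.0                       # degenerate step: restart
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x, f(x)

# Example: an ill-conditioned diagonal quadratic with known minimizer t
c = np.array([1.0, 10.0, 100.0])
t = np.array([1.0, 2.0, 3.0])
quad = lambda x: np.sum(c * (x - t) ** 2)
quad_grad = lambda x: 2.0 * c * (x - t)
x_star, f_star = hybrid_cg(quad, quad_grad, np.zeros(3))
```

With beta clamped at zero and the restart safeguard, every search direction satisfies g @ d < 0, which is the descent property the abstract emphasizes.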
