
Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization
Author(s) -
Hong Seng Sim,
Chuei Yee Chen,
Wah June Leong,
Jiao Li
Publication year - 2022
Publication title -
Journal of Industrial and Management Optimization
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.325
H-Index - 32
eISSN - 1553-166X
pISSN - 1547-5816
DOI - 10.3934/jimo.2021143
Subject(s) - line search , symmetric rank-one update , conjugate gradient method , mathematical optimization , eigenvalues and eigenvectors , convergence , backtracking , spectral gradient method , quasi-Newton method , mathematics , computer science
This paper proposes a nonmonotone spectral gradient method for solving large-scale unconstrained optimization problems. The spectral parameter is derived from the eigenvalues of an optimally sized memoryless symmetric rank-one matrix, obtained under a measure defined as the ratio of the determinant of the updating matrix to its largest eigenvalue. Coupled with a nonmonotone line search strategy in which backtracking-type line search is applied only selectively, the spectral parameter acts as a stepsize during iterations when no line search is performed, and as a milder form of quasi-Newton update when backtracking line search is employed. Convergence of the proposed method is established for uniformly convex functions. Extensive numerical experiments indicate that the proposed spectral gradient method outperforms some standard conjugate gradient methods.
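To illustrate the general scheme the abstract describes, the sketch below implements a nonmonotone spectral gradient iteration in Python. It is not the authors' algorithm: the spectral parameter here is the classical Barzilai-Borwein stepsize rather than the memoryless-SR1-eigenvalue parameter of the paper, and the nonmonotone line search is the standard max-type (Grippo-Lampariello-Lucidi) backtracking rule. All function and parameter names are illustrative assumptions.

```python
import numpy as np

def nonmonotone_spectral_gradient(f, grad, x0, max_iter=500, memory=10,
                                  sigma=1e-4, tol=1e-6):
    """Illustrative nonmonotone spectral gradient method.

    Uses a Barzilai-Borwein spectral stepsize as a stand-in for the
    memoryless-SR1-based spectral parameter of the paper, combined with
    a max-type nonmonotone Armijo backtracking line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                       # initial spectral parameter
    fvals = [f(x)]                    # recent function values for the
                                      # nonmonotone reference
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -alpha * g                # spectral gradient direction
        fmax = max(fvals[-memory:])   # nonmonotone reference value
        t = 1.0
        # Backtracking until the nonmonotone Armijo condition holds
        while f(x + t * d) > fmax + sigma * t * g.dot(d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s.dot(y)
        # BB spectral stepsize, safeguarded to stay positive and bounded
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
        fvals.append(f(x))
    return x
```

Because the Armijo test compares against the maximum of the last `memory` function values rather than the current one, occasional increases in `f` are accepted, which lets the spectral stepsize act unmodified on most iterations; backtracking is triggered only when a trial step is too long.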