
Gradient Sampling Methods with Inexact Subproblem Solutions and Gradient Aggregation
Author(s) - Frank E. Curtis, Minhan Li
Publication year - 2022
Publication title - INFORMS Journal on Optimization
Language(s) - English
Resource type - Journals
eISSN - 2575-1492
pISSN - 2575-1484
DOI - 10.1287/ijoo.2022.0073
Subject(s) - mathematical optimization, nonsmooth optimization, convergence, gradient method, proximal gradient methods, sampling, convex optimization, quadratic programming, trust region, mathematics, computer science
Gradient sampling (GS) methods for the minimization of objective functions that may be nonconvex and/or nonsmooth are proposed, analyzed, and tested. One of the most computationally expensive components of contemporary GS methods is the need to solve a convex quadratic subproblem in each iteration. By contrast, the methods proposed in this paper allow the use of inexact solutions of these subproblems, which, as proved in the paper, can be incorporated without loss of theoretical convergence guarantees. Numerical experiments show that, by exploiting inexact subproblem solutions, one can consistently reduce the computational effort required by a GS method. Additionally, a strategy is proposed for aggregating gradient information after a subproblem is solved (potentially inexactly), in the manner exploited by bundle methods for nonsmooth optimization. It is proved that this aggregation scheme can be introduced without loss of theoretical convergence guarantees. Numerical experiments show that incorporating the gradient aggregation approach can also reduce the computational effort required by a GS method.
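The Python sketch below illustrates the kind of step the abstract refers to: gradients are sampled in a small ball around the current iterate, the convex quadratic subproblem (finding a near-minimum-norm element of the convex hull of the sampled gradients) is solved only inexactly, and the resulting convex combination of gradients is returned as an aggregated gradient. This is a minimal illustration under assumed ingredients, not the authors' algorithm: the function name gs_step, its parameters, the use of Frank-Wolfe as the inexact subproblem solver, and the fixed duality-gap tolerance are all hypothetical choices standing in for the paper's line search, sampling-radius updates, and specific inexactness and aggregation rules.

```python
import numpy as np

def gs_step(grad, x, eps=0.1, m=10, tol=1e-3, max_qp_iters=200, rng=None):
    """One illustrative gradient sampling step with an inexact subproblem solve."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size

    # Sample m points uniformly from the ball of radius eps around x (plus x itself)
    # and stack their gradients as the columns of G.
    pts = [x]
    for _ in range(m):
        u = rng.standard_normal(d)
        u /= max(np.linalg.norm(u), 1e-16)
        pts.append(x + eps * rng.uniform() ** (1.0 / d) * u)
    G = np.column_stack([grad(p) for p in pts])      # shape (d, m + 1)

    # Convex quadratic subproblem: minimize 0.5 * ||G @ lam||^2 over the simplex,
    # i.e., find the minimum-norm element of the convex hull of the sampled gradients.
    # Here it is solved *inexactly* with Frank-Wolfe, stopped once the duality gap
    # falls below `tol` -- a stand-in for the paper's inexactness conditions.
    lam = np.full(m + 1, 1.0 / (m + 1))
    for _ in range(max_qp_iters):
        g_lam = G @ lam                              # current convex combination
        grad_lam = G.T @ g_lam                       # gradient of the QP objective
        i = int(np.argmin(grad_lam))                 # best simplex vertex e_i
        s = -lam
        s[i] += 1.0                                  # Frank-Wolfe direction e_i - lam
        gap = -grad_lam @ s                          # Frank-Wolfe duality gap (>= 0)
        if gap <= tol:
            break                                    # accept the inexact solution
        curv = float(np.dot(G @ s, G @ s))
        gamma = 1.0 if curv <= 0.0 else min(1.0, gap / curv)  # exact line search
        lam = lam + gamma * s

    g_agg = G @ lam       # aggregated gradient: a single convex combination
    return -g_agg, g_agg  # tentative descent direction and the aggregated gradient
```

Stopping the Frank-Wolfe loop early is what makes the subproblem solve inexact; carrying g_agg forward as an extra column of G in the next iteration is one way to mimic the bundle-style gradient aggregation the abstract describes.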