On convergence analysis of dual proximal-gradient methods with approximate gradient for a class of nonsmooth convex minimization problems
Author(s) -
Sanming Liu,
Zhijie Wang,
Chongyang Liu
Publication year - 2015
Publication title -
Journal of Industrial and Management Optimization
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.325
H-Index - 32
eISSN - 1553-166X
pISSN - 1547-5816
DOI - 10.3934/jimo.2016.12.389
Subject(s) - proximal gradient methods , mathematics , convex function , convex optimization , rate of convergence , gradient method , mathematical optimization , mathematical analysis , computer science
In this paper, we consider the problem of minimizing a nonsmooth convex objective which is the sum of a proper, nonsmooth, closed, strongly convex extended real-valued function and a proper, nonsmooth, closed, convex extended real-valued function given by the composition of a proper closed convex function with a nonzero affine map. We first establish its dual problem, which consists of minimizing the sum of a smooth convex function and a proper closed nonsmooth convex function. We then apply a first-order proximal-gradient method to the dual problem, where an error is present in the calculation of the gradient of the smooth term, and present a dual proximal-gradient method with approximate gradient. We show that when the errors are summable, although the sequence of dual objective function values generated by the proximal-gradient method with errors converges to the optimal value at the rate $O(\frac{1}{k})$, the rate of convergence of the primal sequence is of the order $O(\frac{1}{\sqrt{k}})$.
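To make the kind of iteration discussed in the abstract concrete, the following is a minimal sketch of a proximal-gradient method in which the gradient of the smooth term is corrupted by a summable error sequence. The specific objective (an $\ell_1$-regularized least-squares problem), the names soft_threshold and inexact_prox_grad, and the $1/k^2$ error model are illustrative assumptions for this sketch, not the paper's dual construction or its error analysis.

```python
# Sketch of a proximal-gradient iteration with an approximate gradient.
# The problem instance and error model below are assumptions for illustration.

import numpy as np


def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (the nonsmooth term in this sketch)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)


def inexact_prox_grad(A, b, lam, n_iters=500, seed=0):
    """Minimize 0.5*||A y - b||^2 + lam*||y||_1 using a perturbed gradient.

    The gradient A^T (A y - b) of the smooth part is corrupted by an error
    e_k with ||e_k|| = 1/k^2, so that sum_k ||e_k|| is finite -- a stand-in
    for the 'summable errors' assumption mentioned in the abstract.
    """
    rng = np.random.default_rng(seed)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    t = 1.0 / L                            # fixed step size
    y = np.zeros(A.shape[1])
    for k in range(1, n_iters + 1):
        grad = A.T @ (A @ y - b)           # exact gradient of the smooth term
        e = rng.standard_normal(y.shape)
        e *= 1.0 / (k ** 2 * np.linalg.norm(e))   # summable error, ||e_k|| = 1/k^2
        y = soft_threshold(y - t * (grad + e), t * lam)
    return y


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 80))
    b = rng.standard_normal(40)
    y_hat = inexact_prox_grad(A, b, lam=0.1)
    obj = 0.5 * np.linalg.norm(A @ y_hat - b) ** 2 + 0.1 * np.linalg.norm(y_hat, 1)
    print("objective:", obj)
```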