Open Access
A Relax Inexact Accelerated Proximal Gradient Method for the Constrained Minimization Problem of Maximum Eigenvalue Functions
Author(s) - Wei Wang, Shanghua Li, Jingjing Gao
Publication year - 2014
Publication title - Journal of Applied Mathematics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.307
H-Index - 43
eISSN - 1687-0042
pISSN - 1110-757X
DOI - 10.1155/2014/749475
Subject(s) - algorithm , materials science , computer science
For the constrained minimization problem of maximum eigenvalue functions, the objective function is nonsmooth, so the approximate inexact accelerated proximal gradient (AIAPG) method (Wang et al., 2013) can be used to solve its smooth approximation minimization problem. Consider the problem min{λmax(X)+g(X) : X∈Sn}, where λmax(X) is the maximum eigenvalue function and g(X) is a proper lower semicontinuous convex function (possibly nonsmooth), and take g(X)=δΩ(X), where Ω∶={X∈Sn : F(X)=b, X⪰0} and δΩ(X) denotes the indicator function of Ω. The approximate minimizer generated by the AIAPG method must be contained in Ω; otherwise the method is invalid. In this paper, we consider the case where the approximate minimizer cannot be guaranteed to lie in Ω, and we propose two different strategies: constructing a feasible solution, and designing a new method named the relax inexact accelerated proximal gradient (RIAPG) method. Compared with the former strategy, the latter overcomes the drawback that the required conditions are too strict. Furthermore, the RIAPG method inherits the global iteration complexity and attractive computational advantages of the AIAPG method.
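The general scheme the abstract describes can be illustrated with a minimal sketch: smooth λmax(X) (here via log-sum-exp over the eigenvalues, a standard smoothing; the paper's own approximation may differ), then run an accelerated (FISTA-style) proximal gradient iteration. For illustration the proximal map of δΩ is replaced by projection onto the PSD cone alone, i.e. the affine constraint F(X)=b is dropped; all function names and parameter choices below are assumptions, not the authors' implementation.

```python
import numpy as np

def smoothed_lambda_max(X, mu):
    # log-sum-exp smoothing of the maximum eigenvalue:
    # f_mu(X) = mu * log(sum_i exp(lambda_i(X)/mu))
    w, U = np.linalg.eigh(X)          # eigenvalues ascending, orthonormal U
    s = w / mu
    m = s.max()                        # shift for numerical stability
    val = mu * (m + np.log(np.sum(np.exp(s - m))))
    return val, w, U

def grad_smoothed_lambda_max(w, U, mu):
    # gradient is U diag(softmax(w/mu)) U^T, a PSD matrix with trace 1
    p = np.exp((w - w.max()) / mu)
    p /= p.sum()
    return U @ np.diag(p) @ U.T

def project_psd(X):
    # projection onto the PSD cone: clip negative eigenvalues at zero.
    # Used here as a simplified stand-in for the prox of the indicator of
    # Omega (the true Omega also carries the affine constraint F(X)=b).
    w, U = np.linalg.eigh((X + X.T) / 2)
    return U @ np.diag(np.maximum(w, 0)) @ U.T

def accelerated_prox_grad(X0, mu=0.1, iters=100):
    # FISTA-style accelerated proximal gradient on the smoothed objective;
    # the log-sum-exp gradient is Lipschitz with constant 1/mu
    L = 1.0 / mu
    X = X0.copy()
    Y = X0.copy()
    t = 1.0
    for _ in range(iters):
        _, w, U = smoothed_lambda_max(Y, mu)
        G = grad_smoothed_lambda_max(w, U, mu)
        X_new = project_psd(Y - G / L)          # prox/projection step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Y = X_new + ((t - 1) / t_new) * (X_new - X)  # momentum extrapolation
        X, t = X_new, t_new
    return X
```

Because the projection here ignores F(X)=b, the iterates stay PSD but need not be feasible for the full set Ω; that gap between the computed approximate minimizer and Ω is precisely the situation the two strategies in the paper address.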
