Open Access
An over-relaxed ADMM for separable convex programming and its applications to statistical learning
Author(s)
Renyuan Ni
Publication year 2024
The alternating direction method of multipliers (ADMM) has been applied successfully in a broad spectrum of areas. Moreover, it was shown in the literature that ADMM is closely related to the Douglas-Rachford operator-splitting method, and can be viewed as a special case of the proximal point algorithm (PPA). As is well known, the relaxed version of PPA not only maintains the convergence properties of PPA in theory, but also numerically outperforms the original PPA. Therefore, it is interesting to ask whether ADMM can be accelerated by using the over-relaxation technique. In this paper, we answer this question affirmatively by developing an over-relaxed ADMM integrated with a criterion to decide whether a relaxation step is implemented in the iteration. The global convergence and $O(1/t)$ convergence rate of the over-relaxed ADMM are established. We also implement our proposed algorithm to solve Lasso and sparse inverse covariance selection problems, and compare its performance with the relaxed customized ADMM in [CGHY] and the classical ADMM. The results show that our algorithm substantially outperforms the other two methods.
Language(s) English
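
The over-relaxation step the abstract refers to can be illustrated with the classical fixed-parameter relaxed ADMM applied to the Lasso problem, min_x (1/2)||Ax - b||_2^2 + lam*||x||_1, one of the paper's two test problems. The sketch below is a minimal illustration under standard assumptions, not the paper's method: the adaptive criterion that decides whether a relaxation step is taken in each iteration is the paper's contribution and is not reproduced here, and the function name lasso_admm, the penalty rho, the relaxation parameter alpha, and the stopping tolerance are all illustrative choices.

import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1: elementwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lasso_admm(A, b, lam, rho=1.0, alpha=1.6, max_iter=500, tol=1e-6):
    # Relaxed ADMM for (1/2)||Ax - b||^2 + lam*||x||_1 with the splitting x = z.
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor (A^T A + rho I) once; it is reused by every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(max_iter):
        # x-update: solve the ridge-regression subproblem.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # Over-relaxation: blend the new x with the previous z.
        # alpha = 1 recovers the classical (unrelaxed) ADMM iteration.
        x_hat = alpha * x + (1.0 - alpha) * z
        # z-update: soft thresholding, the prox of the l1 term.
        z_old = z
        z = soft_threshold(x_hat + u, lam / rho)
        # Dual update on the scaled multiplier.
        u = u + x_hat - z
        if np.linalg.norm(z - z_old) < tol:  # simple illustrative stopping rule
            break
    return z

With alpha = 1 this reduces to the classical ADMM that the paper compares against; fixed relaxation parameters between roughly 1.5 and 1.8 are commonly reported to accelerate convergence, whereas the paper replaces the fixed parameter with a per-iteration criterion for applying the relaxation step.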
