Open Access
Fast Alternating Direction Optimization Methods
Author(s) -
Tom Goldstein,
Brendan O’Donoghue,
Simon Setzer,
Richard G. Baraniuk
Publication year - 2014
Publication title -
SIAM Journal on Imaging Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.944
H-Index - 71
ISSN - 1936-4954
DOI - 10.1137/120896219
Subject(s) - acceleration , gradient descent , convergence , minimization , mathematical optimization , convex optimization , convex function , algorithm , mathematics , computer science
Alternating direction methods are a common tool for general mathematical programming and optimization. These methods have become particularly important in the field of variational image processing, which frequently requires the minimization of nondifferentiable objectives. This paper considers accelerated (i.e., fast) variants of two common alternating direction methods: the alternating direction method of multipliers (ADMM) and the alternating minimization algorithm (AMA). The proposed acceleration is of the form first proposed by Nesterov for gradient descent methods. In the case that the objective function is strongly convex, global convergence bounds are provided for both classical and accelerated variants of the methods. Numerical examples demonstrate the superior performance of the fast methods on a wide variety of problems.
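To make the idea concrete, here is a minimal sketch of how a Nesterov-style momentum step can be layered on top of standard ADMM, using the lasso problem min 0.5‖Ax − b‖² + λ‖x‖₁ (split as x = z) as a toy instance. This is an illustration of the acceleration pattern the abstract describes, not the authors' exact algorithm: the function name, the choice of combined residual, and the restart test are assumptions made here for the example. (The paper's convergence guarantees for the non-restarted variant require strong convexity; a restart heuristic is included because the lasso objective is not strongly convex in general.)

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fast_admm_lasso(A, b, lam, rho=1.0, iters=300):
    """Sketch of Nesterov-accelerated ADMM for the lasso.

    Illustrative only: names and the restart rule are our choices,
    not taken verbatim from the paper."""
    m, n = A.shape
    # Factor (A^T A + rho I) once; reused in every x-update.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    z = np.zeros(n); u = np.zeros(n)          # ADMM iterates
    z_hat = z.copy(); u_hat = u.copy()        # extrapolated (momentum) points
    alpha = 1.0
    c_prev = np.inf
    for _ in range(iters):
        # Standard ADMM steps, run at the extrapolated points (z_hat, u_hat).
        rhs = Atb + rho * (z_hat - u_hat)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z_new = soft_threshold(x + u_hat, lam / rho)
        u_new = u_hat + x - z_new
        # Combined residual used to decide whether momentum is helping.
        c = rho * np.linalg.norm(u_new - u_hat) ** 2 \
            + rho * np.linalg.norm(z_new - z_hat) ** 2
        if c < c_prev:
            # Nesterov momentum step on z and the (scaled) dual variable u.
            alpha_new = (1 + np.sqrt(1 + 4 * alpha ** 2)) / 2
            w = (alpha - 1) / alpha_new
            z_hat = z_new + w * (z_new - z)
            u_hat = u_new + w * (u_new - u)
            alpha = alpha_new
        else:
            # Restart: discard the momentum when the residual grows.
            alpha, z_hat, u_hat = 1.0, z.copy(), u.copy()
        z, u, c_prev = z_new, u_new, c
    return z
```

The momentum coefficient α follows the familiar Nesterov recurrence α₊ = (1 + √(1 + 4α²))/2; the restart resets it to 1 whenever the combined residual increases, which keeps the method stable on problems that are not strongly convex.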

