Open Access
Representer Theorems for Sparsity-Promoting $\ell _{1}$ Regularization
Author(s) -
Michael Unser,
Julien Fageot,
Harshit Gupta
Publication year - 2016
Publication title -
IEEE Transactions on Information Theory
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.218
H-Index - 286
eISSN - 1557-9654
pISSN - 0018-9448
DOI - 10.1109/tit.2016.2590421
Subject(s) - communication, networking and broadcast technologies; signal processing and analysis
We present a theoretical analysis and comparison of the effect of $\ell_1$ versus $\ell_2$ regularization for the resolution of ill-posed linear inverse and/or compressed sensing problems. Our formulation covers the most general setting where the solution is specified as the minimizer of a convex cost functional. We derive a series of representer theorems that give the generic form of the solution depending on the type of regularization. We start with the analysis of the problem in finite dimensions and then extend our results to the infinite-dimensional spaces $\ell_2(\mathbb{Z})$ and $\ell_1(\mathbb{Z})$. We also consider the use of linear transformations in the form of dictionaries or regularization operators. In particular, we show that the $\ell_2$ solution is forced to live in a predefined subspace that is intrinsically smooth and tied to the measurement operator. The $\ell_1$ solution, on the other hand, is formed by adaptively selecting a subset of atoms in a dictionary that is specified by the regularization operator. Besides the proof that $\ell_1$ solutions are intrinsically sparse, the main outcome of our investigation is that $\ell_1$ regularization is much more favorable for injecting prior knowledge: it results in a functional form that is independent of the system matrix, whereas this is not so in the $\ell_2$ scenario.
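The contrast the abstract draws can be illustrated numerically. The sketch below (not from the paper; an assumed underdetermined Gaussian measurement setup with NumPy) compares the closed-form $\ell_2$ (Tikhonov) solution, which by construction lies in the span of the rows of the measurement matrix $A$, against an $\ell_1$-regularized solution computed by ISTA (proximal gradient with soft-thresholding). The $\ell_2$ minimizer is dense, while the $\ell_1$ minimizer adaptively selects a small subset of atoms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-posed setup: fewer measurements than unknowns (m < n).
m, n = 10, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[3, 11, 25]] = [2.0, -1.5, 1.0]   # sparse ground truth
y = A @ x_true
lam = 0.1                                 # regularization weight (illustrative)

# --- l2 (Tikhonov): closed form; solution lies in span(A^T) by construction,
#     using the identity (A^T A + lam I)^{-1} A^T = A^T (A A^T + lam I)^{-1}.
x_l2 = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(m), y)

# --- l1: ISTA, i.e. gradient step on ||Ax - y||^2 / 2 followed by
#     the soft-thresholding proximal operator of lam * ||x||_1.
def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
x_l1 = np.zeros(n)
for _ in range(5000):
    x_l1 = soft(x_l1 - A.T @ (A @ x_l1 - y) / L, lam / L)

nnz_l2 = np.count_nonzero(np.abs(x_l2) > 1e-6)
nnz_l1 = np.count_nonzero(np.abs(x_l1) > 1e-6)
print("nonzeros in l2 solution:", nnz_l2)   # dense: generically all n entries
print("nonzeros in l1 solution:", nnz_l1)   # sparse: few active atoms
```

The soft-thresholding step sets coordinates exactly to zero, which is the mechanism behind the intrinsic sparsity of $\ell_1$ solutions discussed in the abstract; the $\ell_2$ formula instead smears the solution across the fixed subspace spanned by the rows of $A$.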
