Sparse and Smooth Signal Estimation: Convexification of L0 Formulations

6 Nov 2018 · Alper Atamturk, Andres Gomez, Shaoning Han

Signal estimation problems with smoothness and sparsity priors can be naturally modeled as quadratic optimization with $\ell_0$-"norm" constraints. Since such problems are non-convex and hard to solve, the standard approach is, instead, to tackle their convex surrogates based on $\ell_1$-norm relaxations. In this paper, we propose new iterative (convex) conic quadratic relaxations that exploit not only the $\ell_0$-"norm" terms, but also the fitness and smoothness functions. The iterative convexification approach substantially closes the gap between the $\ell_0$-"norm" and its $\ell_1$ surrogate. These stronger relaxations lead to significantly better estimators than $\ell_1$-norm approaches and also allow one to utilize affine sparsity priors. In addition, the parameters of the model and the resulting estimators are easily interpretable. Experiments with a tailored Lagrangian decomposition method indicate that the proposed iterative convex relaxations yield solutions within 1\% of the exact $\ell_0$ approach and can tackle instances with up to 100,000 variables in under one minute.
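
For context, a minimal sketch of the kind of model the abstract refers to; the symbols below ($y$ for the observed signal, $x$ for the estimate, $\lambda$ for a smoothness weight, $k$ for a sparsity budget) are illustrative assumptions rather than the paper's exact notation:

$$\min_{x \in \mathbb{R}^n} \;\; \sum_{i=1}^{n} (y_i - x_i)^2 \;+\; \lambda \sum_{i=1}^{n-1} (x_{i+1} - x_i)^2 \quad \text{s.t.} \quad \|x\|_0 \le k.$$

The standard convex surrogate would replace the non-convex constraint $\|x\|_0 \le k$ with an $\ell_1$-norm penalty such as $\mu \|x\|_1$; the relaxations proposed in the paper instead convexify the $\ell_0$ terms jointly with the quadratic fitness and smoothness terms.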
