Should we search for a global minimizer of least squares regularized with an ℓ0 penalty to get the exact solution of an underdetermined linear system?

  • Author: Mila Nikolova
  • Affiliation: CMLA, CNRS, ENS Cachan, UniverSud, Cachan Cedex, France
  • Venue: SSVM'11 — Proceedings of the Third International Conference on Scale Space and Variational Methods in Computer Vision
  • Year: 2011


Abstract

We study objectives ${\mathcal{F}}_d$ combining a quadratic data-fidelity term and an ℓ0 regularization. The data $d$ are generated using a full-rank $M \times N$ matrix $A$ with $N > M$. Our main results are as follows. A minimizer of ${\mathcal{F}}_d$ is strict if and only if its support contains at most $M$ elements and the submatrix of $A$ whose columns are indexed by this support has full rank. The continuity of strict minimizers in the data is derived. Global minimizers are always strict. We adopt a weak assumption on $A$ and show that it holds with probability one. The data read $d=A{\ddot{u}}$, where the support of ${\ddot{u}}$ contains at most $M-1$ elements and the submatrix of $A$ whose columns are indexed by this support has full rank. Among all strict (local) minimizers of ${\mathcal{F}}_d$ whose support contains at most $M-1$ elements, the exact solution ${\ddot{u}}$ is the unique vector that cancels the residual. This claim is independent of the regularization parameter. However, ${\ddot{u}}$ is usually a strict local minimizer at which ${\mathcal{F}}_d$ does not reach its global minimum, so global minimization of ${\mathcal{F}}_d$ can prevent the recovery of ${\ddot{u}}$. A numerical example ($A$ is $5 \times 10$) illustrates our main results.
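
To make the recovery claim concrete, here is a minimal numerical sketch in the spirit of the paper's $5 \times 10$ example. It is not the author's code: the objective is assumed to be ${\mathcal{F}}_d(u) = \|Au-d\|^2 + \beta\,\#\mathrm{support}(u)$, and the names (`A`, `u_true`, `beta`, `restricted_ls`), the random matrix, and the sparse vector are illustrative choices. The sketch enumerates all supports with at most $M-1$ elements, solves the least-squares problem restricted to each support (the strict-minimizer candidates), and checks that the only candidate canceling the residual is the exact solution, while the candidate with the smallest ${\mathcal{F}}_d$ value may be a different vector depending on $\beta$.

```python
# Hypothetical sketch (not the author's code). Assumed objective:
#   F_d(u) = ||A u - d||^2 + beta * #support(u),
# with A a 5x10 full-rank matrix and d = A @ u_true for a sparse u_true.

import itertools
import numpy as np

rng = np.random.default_rng(0)
M, N, beta = 5, 10, 0.5                 # beta: arbitrary illustrative weight
A = rng.standard_normal((M, N))         # full rank with probability one

u_true = np.zeros(N)
u_true[[1, 4, 7]] = [1.0, -2.0, 0.5]    # support of size 3 <= M - 1
d = A @ u_true

def restricted_ls(S):
    """Least-squares fit of d using only the columns of A indexed by S."""
    u = np.zeros(N)
    if S:
        u[list(S)] = np.linalg.lstsq(A[:, list(S)], d, rcond=None)[0]
    return u

# Enumerate every candidate strict minimizer whose support has at most M-1 elements.
zero_residual, best_val, best_u = [], np.inf, None
for k in range(M):                                   # support sizes 0, ..., M-1
    for S in itertools.combinations(range(N), k):
        u = restricted_ls(S)
        res2 = np.linalg.norm(A @ u - d) ** 2
        val = res2 + beta * np.count_nonzero(u)
        if res2 < 1e-12:
            zero_residual.append(u)
        if val < best_val:
            best_val, best_u = val, u

# Uniqueness claim: every candidate that cancels the residual is u_true itself.
print(all(np.allclose(u, u_true) for u in zero_residual))      # expected: True
# The candidate with the smallest F_d value need not be u_true for every beta.
print(np.flatnonzero(np.abs(best_u) > 1e-10), np.flatnonzero(u_true))
```

Raising `beta` makes sparser candidates with a nonzero residual cheaper than `u_true`, which mirrors the paper's warning that the global minimizer of ${\mathcal{F}}_d$ need not be the exact solution, whereas the zero-residual test above does not depend on `beta` at all.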