We study objectives ${\mathcal{F}}_d$ combining a quadratic data-fidelity term and an ℓ0 regularization. The data $d$ are generated using a full-rank $M \times N$ matrix $A$ with $N > M$. Our main results are as follows. A (local) minimizer $\hat{u}$ of ${\mathcal{F}}_d$ is strict if and only if the support of $\hat{u}$ has length at most $M$ and the submatrix of $A$ whose columns are indexed by the support of $\hat{u}$ has full rank. The continuity of these minimizers in the data is derived. Global minimizers are always strict. We adopt a weak assumption on $A$ and show that it holds with probability one. Suppose the data read $d = A\ddot{u}$, where the support of $\ddot{u}$ has length at most $M-1$ and the submatrix of $A$ whose columns are indexed by the support of $\ddot{u}$ has full rank. Then, among all strict (local) minimizers of ${\mathcal{F}}_d$ with support of length at most $M-1$, the exact solution $\ddot{u}$ is the unique vector that cancels the residual. This claim is independent of the regularization parameter. The exact solution is usually a strict local minimizer at which ${\mathcal{F}}_d$ does not reach its global minimum, so global minimization of ${\mathcal{F}}_d$ can then prevent the recovery of $\ddot{u}$. A numerical example (with $A$ of size $5 \times 10$) illustrates our main results.
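The characterization above can be sketched numerically. The snippet below (a minimal illustration, not the paper's code; the function name and the chosen supports are mine) fits the data by least squares on a fixed support: by the characterization, the resulting vector is a strict (local) minimizer of ${\mathcal{F}}_d$ whenever the support has length at most $M$ and the corresponding column submatrix of $A$ has full rank. With noiseless data $d = A\ddot{u}$ and the true support, the residual cancels and $\ddot{u}$ is recovered; on a wrong support of the same length the residual generically does not cancel. The sizes match the paper's $5 \times 10$ example.

```python
import numpy as np

def strict_local_minimizer(A, d, support):
    """Least-squares fit of d on the columns of A indexed by `support`,
    zero elsewhere. This is a strict (local) minimizer of F_d when
    len(support) <= M and A[:, support] has full rank."""
    support = list(support)
    u = np.zeros(A.shape[1])
    coeffs, *_ = np.linalg.lstsq(A[:, support], d, rcond=None)
    u[support] = coeffs
    return u

rng = np.random.default_rng(0)
M, N = 5, 10                        # same sizes as the paper's numerical example
A = rng.standard_normal((M, N))     # a random A is full rank with probability one

u_true = np.zeros(N)
u_true[[1, 4, 7]] = [2.0, -1.0, 3.0]  # support length 3 <= M - 1
d = A @ u_true                        # noiseless data d = A @ u_true

# On the true support the residual cancels and u_true is recovered exactly.
u_hat = strict_local_minimizer(A, d, [1, 4, 7])
print(np.allclose(u_hat, u_true))       # True
print(np.allclose(A @ u_hat, d))        # True

# On a wrong support of the same length the residual generically does not cancel.
u_other = strict_local_minimizer(A, d, [0, 2, 3])
print(np.linalg.norm(A @ u_other - d) > 1e-8)
```

Among all such support-restricted least-squares solutions with support no longer than $M-1$, only the one built on the true support makes the residual vanish, which is exactly the uniqueness claim of the abstract.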