The problem of sparsity pattern or support set recovery refers to estimating the set of nonzero coefficients of an unknown vector β* ∈ R^p based on a set of n noisy observations. It arises in a variety of settings, including subset selection in regression, graphical model selection, signal denoising, compressive sensing, and constructive approximation. The sample complexity of a given method for subset recovery is the scaling of the required sample size n as a function of the signal dimension p, the sparsity index k (the number of nonzeros in β*), the minimum value β_min of β* over its support, and other parameters of the measurement matrix. This paper studies the information-theoretic limits of sparsity recovery: for a noisy linear observation model with random measurement matrices drawn from general Gaussian ensembles, we derive both a set of sufficient conditions for exact support recovery using an exhaustive search decoder, and a set of necessary conditions that any decoder, regardless of its computational complexity, must satisfy for exact support recovery. This analysis of fundamental limits complements our previous work on sharp thresholds for support set recovery over the same random measurement ensembles using the polynomial-time Lasso method (ℓ1-constrained quadratic programming).
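To make the setup concrete, here is a minimal sketch of the noisy linear observation model and an exhaustive search decoder that scans every size-k support. All dimensions, the noise level, and the decoder's least-squares scoring rule are illustrative assumptions, not values or definitions taken from the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper)
p, k, n = 8, 2, 20
beta_min = 1.0

# k-sparse signal beta* whose nonzero entries have magnitude >= beta_min
support = set(rng.choice(p, size=k, replace=False).tolist())
beta = np.zeros(p)
beta[list(support)] = beta_min * rng.choice([-1.0, 1.0], size=k)

# Standard Gaussian measurement matrix and noisy linear observations
X = rng.standard_normal((n, p))
y = X @ beta + 0.1 * rng.standard_normal(n)

def exhaustive_decoder(y, X, k):
    """Try every size-k support; return the one whose columns of X
    best fit y in the least-squares sense."""
    best_support, best_residual = None, np.inf
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = X[:, list(S)]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        residual = np.linalg.norm(y - cols @ coef)
        if residual < best_residual:
            best_support, best_residual = set(S), residual
    return best_support

estimated = exhaustive_decoder(y, X, k)
print(estimated, support)
```

The decoder examines all C(p, k) candidate supports, which is why it serves as an information-theoretic benchmark rather than a practical method; the paper's point is to characterize when exact recovery is possible at all, independent of computational cost.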