Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting
IEEE Transactions on Information Theory
We study the information-theoretic limits of exactly recovering the support set of a sparse signal from noisy projections defined by various classes of measurement matrices. Our analysis is high-dimensional in nature, allowing the number of observations n, the ambient signal dimension p, and the signal sparsity k to tend to infinity in a general manner. This paper makes two novel contributions. First, we provide sharper necessary conditions for exact support recovery using general (including non-Gaussian) dense measurement matrices. Combined with previously known sufficient conditions, this result yields sharp characterizations of when the optimal decoder can recover a signal for various scalings of the signal sparsity k and sample size n, including the important special case of linear sparsity (k = Θ(p)) with a linear scaling of observations (n = Θ(p)). Our second contribution is to prove necessary conditions on the number of observations n required for asymptotically reliable recovery using a class of γ-sparsified measurement matrices, where the measurement sparsity parameter γ(n, p, k) ∈ (0, 1] corresponds to the fraction of nonzero entries per row. Our analysis allows general scaling of the quadruplet (n, p, k, γ), and reveals three distinct regimes, in which measurement sparsity has no asymptotic effect, a minor effect, or a dramatic effect on the information-theoretic limits of the subset recovery problem.
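The measurement model above can be sketched concretely. The following is a minimal illustration, not the paper's analysis: it draws a γ-sparsified measurement matrix (each entry nonzero with probability γ, scaled so entries have unit variance), forms noisy projections y = Xβ + w of a k-sparse signal, and applies a naive correlation-thresholding decoder. The dimensions, noise level, and decoder are all chosen for illustration; the paper's results concern the optimal decoder, which searches over all size-k subsets.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparsified_matrix(n, p, gamma, rng):
    """Draw an n x p matrix whose entries are, independently, zero with
    probability 1 - gamma and N(0, 1/gamma) with probability gamma, so each
    row has about gamma * p nonzeros and every entry has unit variance."""
    mask = rng.random((n, p)) < gamma
    return mask * rng.normal(0.0, 1.0 / np.sqrt(gamma), size=(n, p))

# Illustrative problem dimensions (chosen here, not from the paper):
# n observations, ambient dimension p, sparsity k, measurement sparsity gamma.
n, p, k, gamma = 400, 50, 3, 0.3

# A k-sparse signal with support {0, ..., k-1} and unit amplitudes.
beta = np.zeros(p)
beta[:k] = 1.0

X = sparsified_matrix(n, p, gamma, rng)
y = X @ beta + 0.1 * rng.normal(size=n)   # noisy projections y = X beta + w

# Naive decoder (illustration only): keep the k columns most correlated
# with y.  At these generous dimensions it recovers the true support.
scores = np.abs(X.T @ y)
recovered = np.sort(np.argsort(scores)[-k:])
print(recovered)
```

Note the 1/√γ scaling: as γ shrinks, each row carries fewer but larger entries, which is what makes the three asymptotic regimes in γ(n, p, k) meaningful rather than a trivial loss of signal energy.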