In this paper, we study the number of measurements required to recover a sparse signal in C^M with L nonzero coefficients from compressed samples in the presence of noise. We consider several recovery criteria, including exact recovery of the support of the signal, which was previously considered in the literature, as well as new criteria for the recovery of a large fraction of the support of the signal and for the recovery of a large fraction of the energy of the signal. For these recovery criteria, we prove that O(L) measurements (an asymptotically linear multiple of L) are necessary and sufficient for signal recovery whenever L grows linearly as a function of M. This improves on the existing literature, which is mostly focused on variants of a specific recovery algorithm based on convex programming, for which O(L log(M - L)) measurements are required; in contrast, an implementation of our proof method would have higher complexity. We also show that O(L log(M - L)) measurements are required in the sublinear regime (L = o(M)). For our sufficiency proofs, we introduce a Shannon-theoretic decoder based on joint typicality, which allows error events to be defined in terms of a single random variable, in contrast to previous information-theoretic work, where comparisons of random variables were required. We also prove concentration results for our error bounds, implying that a randomly selected Gaussian matrix suffices with high probability. For our necessity proofs, we rely on results from channel coding and rate-distortion theory.
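The joint-typicality idea can be sketched as follows: a candidate support S of size L is declared typical when the energy of the measurement vector y left over after projecting onto the S-columns of the measurement matrix is close to its expected value (N - L)*sigma^2 under the hypothesis that S is the true support. The sketch below is illustrative only; the function names, the threshold delta*N*sigma^2, and the orthonormal-column shortcut are this sketch's assumptions, not the paper's exact construction (which uses a random Gaussian matrix and a full least-squares projection).

```python
# Illustrative joint-typicality support test, in the spirit of the
# decoder described in the abstract (names and thresholds are assumed
# for this sketch, not taken from the paper).

def residual_energy(cols, y):
    """Energy of y after sequentially projecting out the given columns.
    This equals the true projection residual only when the columns are
    orthonormal (otherwise Gram-Schmidt / least squares is needed)."""
    r = list(y)
    for col in cols:
        c = sum(ci * ri for ci, ri in zip(col, r))
        r = [ri - c * ci for ci, ri in zip(col, r)]
    return sum(ri * ri for ri in r)

def is_jointly_typical(A, y, S, sigma2, delta):
    """Declare the size-L candidate support S typical when the residual
    energy of y off the span of the S-columns is close to its expected
    value (N - L) * sigma2 under the hypothesis that S is correct."""
    N, L = len(y), len(S)
    cols = [[A[n][j] for n in range(N)] for j in S]
    return abs(residual_energy(cols, y) - (N - L) * sigma2) <= delta * N * sigma2

# Toy example: N = 4 measurements, M = 5 unit-norm columns
# (four standard basis vectors plus one dense column), true support {0, 2}.
A = [
    [1, 0, 0, 0, 0.5],
    [0, 1, 0, 0, 0.5],
    [0, 0, 1, 0, 0.5],
    [0, 0, 0, 1, 0.5],
]
y = [2.0, 0.5, 1.0, 0.0]  # measurements of x = (2, 0, 1, 0, 0) plus noise in one coordinate
print(is_jointly_typical(A, y, [0, 2], 0.25, 0.3))  # True: correct support
print(is_jointly_typical(A, y, [1, 3], 0.25, 0.3))  # False: wrong support
```

Because the error event for each candidate support is a statement about a single scalar (the residual energy), its probability can be bounded with standard concentration arguments, which is the advantage the abstract attributes to this style of decoder.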