Information theoretic bounds for compressed sensing
IEEE Transactions on Information Theory
The paper considers the problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements. A new necessary condition on the number of measurements for asymptotically reliable detection with maximum-likelihood (ML) estimation and Gaussian measurement matrices is derived. This necessary condition for ML detection is compared against a sufficient condition for simple maximum-correlation (MC) or thresholding algorithms. The analysis shows that the gap between thresholding and ML is described by a simple expression in terms of the total signal-to-noise ratio (SNR), and that this gap grows as the SNR increases. Thresholding is also compared against the more sophisticated Lasso and orthogonal matching pursuit (OMP) methods. At high SNRs, it is shown that the advantage of Lasso and OMP over thresholding is determined by the range of powers of the nonzero components of the unknown signal. Specifically, the key benefit of Lasso and OMP over thresholding is their ability to detect signals with relatively small nonzero components.
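The contrast between thresholding and OMP described above can be sketched numerically. The following is a minimal illustration, not an implementation of the paper's analysis: the problem sizes (n, m, k), noise level, and component magnitudes are arbitrary choices, with one deliberately small nonzero component to show the regime where a single pass of correlations can miss it while OMP's greedy refits can still find it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the paper)
n, m, k = 256, 128, 5

# k-sparse signal; the last component is deliberately small, so the
# support has a wide range of component powers
support = np.sort(rng.choice(n, size=k, replace=False))
x = np.zeros(n)
x[support] = [1.0, -1.0, 1.0, -1.0, 0.3]

A = rng.standard_normal((m, n)) / np.sqrt(m)  # i.i.d. Gaussian measurement matrix
y = A @ x + 0.01 * rng.standard_normal(m)     # noisy measurements

# Maximum-correlation (thresholding): keep the k largest |A^T y| in one pass
thresh_support = np.sort(np.argsort(np.abs(A.T @ y))[-k:])

# Orthogonal matching pursuit: greedily pick the column most correlated
# with the residual, then refit by least squares and update the residual
def omp(A, y, k):
    residual, chosen = y.copy(), []
    for _ in range(k):
        chosen.append(int(np.argmax(np.abs(A.T @ residual))))
        sub = A[:, chosen]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ coef
    return np.sort(chosen)

omp_support = omp(A, y, k)

print("true support:", support)
print("thresholding:", thresh_support, set(thresh_support) == set(support))
print("OMP:         ", omp_support, set(omp_support) == set(support))
```

Because OMP subtracts the contribution of the large components before looking again, the small component's correlation is no longer buried under interference from the large ones; thresholding, which uses only the initial correlations, has no such mechanism.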