In this paper, we derive information-theoretic performance bounds for the sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of reconstruction distortion: support errors and mean-squared errors. Our goal is to relate the number of measurements, m, and the SNR to the signal sparsity, k, the distortion level, d, and the signal dimension, n. We consider support errors in a worst-case setting. We employ different variations of Fano's inequality to derive necessary conditions on the number of measurements and the SNR required for exact reconstruction. To derive sufficient conditions, we develop new insights into maximum-likelihood (ML) analysis based on a novel superposition property. In particular, this property implies that small support errors are the dominant error events. Consequently, our ML analysis does not suffer from the conservatism of the union bound and leads to a tighter characterization of ML performance. These results provide order-wise tight bounds. For output noise models, we show that, asymptotically, an SNR of Θ(log(n)) together with Θ(k log(n/k)) measurements is necessary and sufficient for exact support recovery. Furthermore, if a small fraction of support errors can be tolerated, a constant SNR turns out to be sufficient in the linear sparsity regime. In contrast, for input noise models, we show that support recovery fails if the number of measurements scales as o(n log(n)/SNR), implying poor compression performance for such cases. Motivated by the fact that the worst-case setup requires a significantly high SNR and a substantial number of measurements for both input and output noise models, we consider a Bayesian setup. To derive necessary conditions, we develop novel extensions to Fano's inequality that handle continuous domains and arbitrary distortions. We then develop a new ML analysis over the set of rate-distortion quantization points to characterize the tradeoff between mean-squared distortion and the number of measurements using rate-distortion theory. We show that, with constant SNR, the number of measurements scales linearly with the rate-distortion function of the sparse phenomenon.
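To make the two noise models concrete, the following is a minimal, self-contained Python sketch; it is not code from the paper, and the toy dimensions, the log(n) SNR scaling, and the brute-force maximum-likelihood search are illustrative assumptions only. It generates a k-sparse signal, measures it through a random Gaussian projection under both the output-noise model y = Ax + w and the input-noise model y = A(x + w), and estimates the support by exhaustive ML search over all k-subsets.

    # Illustrative sketch (not from the paper): the two measurement models and
    # an exhaustive ML support search on a toy instance. All sizes and
    # scalings below are hypothetical choices for demonstration.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    n, m, k = 12, 8, 2                 # signal dimension, measurements, sparsity (toy sizes)
    snr = np.log(n)                    # output-noise case: SNR on the order of log(n)

    # Ground-truth k-sparse signal with component energy scaled to the target SNR.
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x[support] = np.sqrt(snr)

    A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian projection matrix

    y_out = A @ x + rng.normal(size=m)    # output noise: y = A x + w
    y_in = A @ (x + rng.normal(size=n))   # input noise:  y = A (x + w)

    def ml_support(y, A, k):
        # Exhaustive ML support search: pick the k-subset whose least-squares
        # fit leaves the smallest residual (feasible only at toy sizes).
        best, best_res = None, np.inf
        for S in itertools.combinations(range(A.shape[1]), k):
            AS = A[:, list(S)]
            coef, *_ = np.linalg.lstsq(AS, y, rcond=None)
            res = float(np.sum((y - AS @ coef) ** 2))
            if res < best_res:
                best, best_res = set(S), res
        return best

    print("true support:              ", set(support.tolist()))
    print("ML estimate (output noise):", ml_support(y_out, A, k))
    print("ML estimate (input noise): ", ml_support(y_in, A, k))

At these toy sizes the exhaustive search is tractable; in the paper, ML decoding serves as the object of asymptotic analysis rather than as a practical algorithm.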