Information theoretic bounds for compressed sensing

  • Authors:
  • Shuchin Aeron, Venkatesh Saligrama, Manqi Zhao

  • Affiliation:
  • Department of Electrical and Computer Engineering, Boston University, Boston, MA (all authors)

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2010

Abstract

In this paper, we derive information theoretic performance bounds for the sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of reconstruction distortion: support errors and mean-squared errors. Our goal is to relate the number of measurements, m, and the SNR to the signal sparsity, k, the distortion level, d, and the signal dimension, n.

We consider support errors in a worst-case setting. We employ different variations of Fano's inequality to derive necessary conditions on the number of measurements and the SNR required for exact reconstruction. To derive sufficient conditions, we develop new insights into maximum-likelihood (ML) analysis based on a novel superposition property. In particular, this property implies that small support errors are the dominant error events. Consequently, our ML analysis does not suffer from the conservatism of the union bound and leads to a tighter characterization; these results provide order-wise tight bounds. For output noise models, we show that asymptotically an SNR of Θ(log(n)) together with Θ(k log(n/k)) measurements is necessary and sufficient for exact support recovery. Furthermore, if a small fraction of support errors can be tolerated, a constant SNR turns out to be sufficient in the linear sparsity regime. In contrast, for input noise models, we show that support recovery fails if the number of measurements scales as o(n log(n)/SNR), implying poor compression performance in such cases.

Motivated by the fact that the worst-case setup requires a significantly high SNR and a substantial number of measurements for both input and output noise models, we consider a Bayesian setup. To derive necessary conditions, we develop novel extensions of Fano's inequality that handle continuous domains and arbitrary distortions. We then develop a new ML analysis over the set of rate-distortion quantization points to characterize the tradeoff between mean-squared distortion and the number of measurements using rate-distortion theory. We show that, with constant SNR, the number of measurements scales linearly with the rate-distortion function of the sparse phenomena.
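
As a rough illustration of the Fano-style necessary conditions mentioned above, the following LaTeX sketch works through the standard counting argument for exact support recovery under the output noise model. The notation and the Gaussian-channel bound on the mutual information are our expository assumptions; the paper's actual variants of Fano's inequality may differ.

```latex
% Minimal sketch of a Fano-style necessary condition for exact support
% recovery (our notation; not necessarily the paper's derivation).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $S$ be a support drawn uniformly from the $\binom{n}{k}$ possibilities
and let $Y \in \mathbb{R}^m$ be the noisy measurements. Fano's inequality
gives
\[
  P_e \;\ge\; 1 - \frac{I(S;Y) + 1}{\log \binom{n}{k}} .
\]
For an output (additive Gaussian) noise model, the Gaussian channel
capacity bounds the mutual information per measurement, so
\[
  I(S;Y) \;\le\; \frac{m}{2}\log(1 + \mathrm{SNR}),
\]
and requiring $P_e \to 0$ yields the order-wise necessary condition
\[
  m \;\ge\; \frac{2\log\binom{n}{k}}{\log(1+\mathrm{SNR})}
    \;=\; \Omega\!\left(\frac{k\log(n/k)}{\log(1+\mathrm{SNR})}\right).
\]
\end{document}
```

With $\mathrm{SNR} = \Theta(\log n)$ this counting bound is consistent with the $\Theta(k \log(n/k))$ measurement scaling quoted in the abstract.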
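
The abstract's sufficiency claim rests on ML decoding over candidate supports. Below is a small numerical sketch, assuming a Gaussian measurement ensemble and brute-force least-squares ML over all k-subsets (our illustrative setup, not the paper's experiments); it estimates the empirical exact-support-recovery rate as the number of measurements m grows, with the SNR set on the Θ(log n) scale.

```python
# Hedged numerical sketch: exact support recovery for the output noise
# model y = A x + w, decoded by brute-force maximum likelihood (least
# squares over every candidate support). Problem sizes, SNR, and trial
# counts are illustrative assumptions only.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def ml_support(A, y, k):
    """Return the k-subset minimizing the residual ||y - A_S beta||^2."""
    n = A.shape[1]
    best, best_res = None, np.inf
    for S in itertools.combinations(range(n), k):
        A_S = A[:, list(S)]
        beta, *_ = np.linalg.lstsq(A_S, y, rcond=None)
        r = np.sum((y - A_S @ beta) ** 2)
        if r < best_res:
            best, best_res = S, r
    return set(best)

def trial(n, k, m, snr):
    """One draw: Gaussian A, k-sparse x with +-sqrt(snr) entries, unit noise."""
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    S = set(rng.choice(n, size=k, replace=False).tolist())
    x = np.zeros(n)
    x[list(S)] = np.sqrt(snr) * rng.choice([-1.0, 1.0], size=k)
    y = A @ x + rng.standard_normal(m)
    return ml_support(A, y, k) == S

n, k = 16, 2
snr = np.log(n)  # SNR on the Theta(log n) scale from the abstract
for m in (4, 8, 12, 16):
    rate = np.mean([trial(n, k, m, snr) for _ in range(50)])
    print(f"m={m:2d}  exact-recovery rate={rate:.2f}")
```

The exhaustive search is exponential in k, so this sketch is only feasible for toy sizes; it is meant to make the m-versus-recovery tradeoff concrete, not to reproduce the paper's bounds.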