We consider a model of the form $y = Ax + n$, where $x \in \mathbb{C}^M$ is sparse with at most $L$ nonzero coefficients in unknown locations, $y \in \mathbb{C}^N$ is the observation vector, $A \in \mathbb{C}^{N \times M}$ is the measurement matrix, and $n \in \mathbb{C}^N$ is Gaussian noise. We develop a Cramér-Rao bound on the mean squared estimation error of the nonzero elements of $x$, corresponding to the genie-aided estimator (GAE), which is provided with the locations of the nonzero elements of $x$. Intuitively, the mean squared estimation error of any estimator without knowledge of the locations of the nonzero elements of $x$ is no less than that of the GAE. Assuming that $L/N$ is fixed, we establish the existence of an estimator that asymptotically achieves the Cramér-Rao bound without any knowledge of the locations of the nonzero elements of $x$ as $N \to \infty$, when $A$ is a random Gaussian matrix whose elements are drawn i.i.d. according to $\mathcal{N}(0, 1)$.
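To make the genie-aided benchmark concrete, the following Python sketch simulates the real-valued analogue of the model with an i.i.d. $\mathcal{N}(0,1)$ measurement matrix, computes the GAE (least squares restricted to the true support), and compares its empirical mean squared error with the corresponding Cramér-Rao bound $\sigma^2\,\mathrm{tr}\big((A_S^{\top} A_S)^{-1}\big)$. The problem sizes, noise variance, and Gaussian nonzero coefficients are illustrative assumptions, and this is not the support-agnostic estimator whose asymptotic achievability is established in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes and noise level (assumed, not from the paper)
N, M, L = 128, 512, 8      # measurements, ambient dimension, sparsity
sigma2 = 0.1               # noise variance
trials = 500

mse_gae, crb = 0.0, 0.0
for _ in range(trials):
    A = rng.standard_normal((N, M))              # i.i.d. N(0,1) measurement matrix
    support = rng.choice(M, size=L, replace=False)
    x = np.zeros(M)
    x[support] = rng.standard_normal(L)          # nonzero coefficients (assumed Gaussian)
    y = A @ x + np.sqrt(sigma2) * rng.standard_normal(N)

    # Genie-aided estimator: least squares on the known (true) support
    As = A[:, support]
    x_hat_s, *_ = np.linalg.lstsq(As, y, rcond=None)
    mse_gae += np.sum((x_hat_s - x[support]) ** 2)

    # Cramér-Rao bound for the genie-aided problem: sigma^2 * tr((As^T As)^{-1})
    crb += sigma2 * np.trace(np.linalg.inv(As.T @ As))

print("empirical GAE MSE:", mse_gae / trials)
print("average CRB      :", crb / trials)
```

With the sizes above, the two printed values should nearly coincide, reflecting that the GAE attains the bound once the support is known; the abstract's result concerns closing this gap asymptotically without support knowledge.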