Information Theory and Reliable Communication
Information Theory: Coding Theorems for Discrete Memoryless Systems
Fundamental limits of almost lossless analog compression
Proceedings of the 2009 IEEE International Symposium on Information Theory (ISIT 2009), Volume 1
On the rate-distortion function of random vectors and stationary sources with mixed distributions
IEEE Transactions on Information Theory
Csiszár's cutoff rates for arbitrary discrete sources
IEEE Transactions on Information Theory
Generalized cutoff rates and Rényi's information measures
IEEE Transactions on Information Theory
Measuring statistical dependence via the mutual information dimension
Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI 2013)
In Shannon theory, lossless source coding deals with the optimal compression of discrete sources. Compressed sensing can be viewed as a lossless coding strategy for analog sources, in which compression is performed by multiplication with real-valued matrices. In this paper we study almost lossless analog compression of analog memoryless sources in an information-theoretic framework, in which the compressor or decompressor is constrained by various regularity conditions, in particular linearity of the compressor and Lipschitz continuity of the decompressor. The fundamental limit is shown to be the information dimension proposed by Rényi in 1959.
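For readers unfamiliar with the quantity named in the abstract, the following is a minimal sketch of the standard definition of Rényi's information dimension; the quantization notation used below is introduced here for illustration and does not appear on this page.

\[
  \langle X \rangle_m = \frac{\lfloor m X \rfloor}{m},
  \qquad
  d(X) = \lim_{m\to\infty} \frac{H\!\left(\langle X \rangle_m\right)}{\log m},
\]

where the lower and upper information dimensions are obtained by replacing the limit with the lim inf and lim sup when it does not exist. In particular, if the law of X is a mixture of a discrete part and an absolutely continuous part with weight ρ on the continuous part (and H(⌊X⌋) is finite), then d(X) = ρ, which is the form in which the fundamental compression rate referred to in the abstract is typically expressed.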