This paper describes universal lossless coding strategies for compressing sources on countably infinite alphabets. Classes of memoryless sources defined by an envelope condition on the marginal distribution provide benchmarks for coding techniques originating from the theory of universal coding over finite alphabets. We prove general upper bounds on minimax regret and lower bounds on minimax redundancy for such source classes. The general upper bounds emphasize the role of the normalized maximum likelihood (NML) codes with respect to minimax regret in the infinite-alphabet context. Lower bounds are derived by tailoring sharp bounds on the redundancy of Krichevsky-Trofimov coders for sources over finite alphabets. Up to logarithmic (resp., constant) factors, the bounds are matching for source classes defined by algebraically vanishing (resp., exponentially vanishing) envelopes. Effective and (almost) adaptive coding techniques are described for the collection of source classes defined by algebraically vanishing envelopes. These results extend our knowledge of universal coding to contexts where the key tools from parametric inference are known to fail.