Strong typicality, which is more powerful for proving theorems than weak typicality, applies only to finite alphabets, whereas weak typicality applies to countable alphabets. In this paper, the relation between typicality and information divergence measures is discussed. A new information divergence measure leads to the definition of a unified typicality for finite or countably infinite alphabets that is stronger than both weak typicality and strong typicality. Unified typicality retains the asymptotic equipartition property and the structural properties of strong typicality, and it can potentially be used to generalize theorems previously established via strong typicality to countable alphabets. Applications to rate-distortion theory and multi-source network coding problems are discussed.
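The three notions of typicality mentioned above can be made concrete for an i.i.d. source. The sketch below, a hypothetical illustration rather than the paper's exact formulation, checks weak typicality (empirical entropy rate close to H(P)), strong typicality (small variational distance between the empirical and true distributions), and a unified condition assumed here to combine KL divergence with the entropy difference, D(q‖P) + |H(q) − H(P)| ≤ η; the function names and thresholds are illustrative.

```python
import math
from collections import Counter

def entropy(p):
    """Shannon entropy (in bits) of a distribution given as {symbol: prob}."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def divergence(q, p):
    """KL divergence D(q || p) in bits; infinite if q has mass outside p's support."""
    d = 0.0
    for x, qx in q.items():
        if qx == 0:
            continue
        if p.get(x, 0) == 0:
            return math.inf
        d += qx * math.log2(qx / p[x])
    return d

def empirical(seq):
    """Empirical (type) distribution of a sequence."""
    n = len(seq)
    return {x: c / n for x, c in Counter(seq).items()}

def weakly_typical(seq, p, eps):
    """Weak typicality: |-(1/n) log p(seq) - H(p)| <= eps."""
    logp = sum(math.log2(p[x]) for x in seq)  # assumes p[x] > 0 for every symbol seen
    return abs(-logp / len(seq) - entropy(p)) <= eps

def strongly_typical(seq, p, delta):
    """Strong typicality: variational distance between type and p at most delta."""
    q = empirical(seq)
    support = set(p) | set(q)
    return sum(abs(q.get(x, 0.0) - p.get(x, 0.0)) for x in support) <= delta

def unified_typical(seq, p, eta):
    """Unified typicality (assumed form): D(q||p) + |H(q) - H(p)| <= eta."""
    q = empirical(seq)
    return divergence(q, p) + abs(entropy(q) - entropy(p)) <= eta
```

For a uniform source on {a, b}, a heavily skewed sequence such as 90 a's and 10 b's is still weakly typical (every sequence has probability 2^-n, so the empirical entropy rate equals H(P) exactly), yet it fails both the strong and the unified conditions, illustrating why the latter notions are stronger.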