Information Theory: Coding Theorems for Discrete Memoryless Systems
On Estimation of Information via Variation. Problems of Information Transmission.
Mutual information, variation, and Fano's inequality. Problems of Information Transmission.
Mutual information of several random variables and its estimation via variation. Problems of Information Transmission.
On computation of information via variation and inequalities for the entropy function. Problems of Information Transmission.
Generalization of a Pinsker problem. Problems of Information Transmission.
We continue the study of the relationship between mutual information and variational distance begun in Pinsker's paper [1], where an upper bound on mutual information in terms of variational distance was obtained. We present a simple lower bound, which in some cases is optimal or asymptotically optimal. A uniform upper bound on mutual information in terms of variational distance is also derived for random variables with a finite number of values. For such random variables, we also investigate the asymptotic behaviour of the maximum of mutual information in the cases where the variational distance tends either to zero or to its maximum value.
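To make the two quantities concrete: for a discrete pair (X, Y), the mutual information is the divergence I(X;Y) = D(P_XY || P_X P_Y), and the variational distance here is V = Σ |p(x,y) − p(x)p(y)|. The sketch below (an illustration only, not the bounds derived in this paper) computes both for a toy joint distribution and checks the classical Pinsker-type lower bound I ≥ V²/2 in nats; the function name is ours, not from the paper.

```python
import math

def mutual_information_and_variation(joint):
    """For a joint pmf p(x, y) given as a 2-D list, return
    (mutual information in nats, variational distance between
    the joint pmf and the product of its marginals)."""
    px = [sum(row) for row in joint]           # marginal of X
    py = [sum(col) for col in zip(*joint)]     # marginal of Y
    mi = 0.0
    var = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            q = px[i] * py[j]                  # independent reference pmf
            var += abs(p - q)
            if p > 0:
                mi += p * math.log(p / q)
    return mi, var

# Toy example: a correlated pair of binary random variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
mi, v = mutual_information_and_variation(joint)
print(mi, v)
# Pinsker's classical inequality: I(X;Y) = D(P_XY || P_X P_Y) >= V^2 / 2.
assert mi >= v * v / 2
```

Here V = 0.6 and I ≈ 0.193 nat, so the classical bound V²/2 = 0.18 holds with little slack; the paper's interest is in sharper and uniform bounds of this kind.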