Information Theory: Coding Theorems for Discrete Memoryless Systems
On Estimation of Information via Variation. Problems of Information Transmission.
On inequalities between mutual information and variation. Problems of Information Transmission.
Mutual information, variation, and Fano's inequality. Problems of Information Transmission.
Mutual information of several random variables and its estimation via variation. Problems of Information Transmission.
Estimating Mutual Information Via Kolmogorov Distance. IEEE Transactions on Information Theory.
We consider a generalization of Pinsker's problem [1] on estimating mutual information via variation. We obtain upper and lower bounds on the maximum absolute difference between the mutual informations of several random variables in terms of the variational distance between the probability distributions of these random variables. In some cases these bounds are optimal.
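The two quantities being compared in the abstract can be illustrated numerically. The sketch below (an illustration only, not the paper's bounds; function names and the two example joint distributions are our own choices) computes the mutual information of a pair of discrete random variables from their joint distribution and the variational (L1) distance between two joint distributions:

```python
import numpy as np

def mutual_information(p):
    """I(X;Y) in bits for a joint distribution p[x, y]."""
    px = p.sum(axis=1, keepdims=True)   # marginal of X, shape (|X|, 1)
    py = p.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, |Y|)
    mask = p > 0                        # skip zero-probability cells (0 log 0 = 0)
    return float(np.sum(p[mask] * np.log2(p[mask] / (px @ py)[mask])))

def variational_distance(p, q):
    """Variational (L1) distance between two joint distributions."""
    return float(np.abs(p - q).sum())

# Independent uniform pair: I(X;Y) = 0
p = np.array([[0.25, 0.25],
              [0.25, 0.25]])
# Perfectly correlated uniform pair: I(X;Y) = 1 bit
q = np.array([[0.5, 0.0],
              [0.0, 0.5]])

print(mutual_information(p))      # 0.0
print(mutual_information(q))      # 1.0
print(variational_distance(p, q)) # 1.0
```

Here a variational distance of 1 between the joints is accompanied by a mutual-information gap of 1 bit; the bounds discussed in the abstract quantify how large such a gap can be for a given variational distance.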