Information Theory: Coding Theorems for Discrete Memoryless Systems
On Estimation of Information via Variation. Problems of Information Transmission.
On inequalities between mutual information and variation. Problems of Information Transmission.
Refinements of Pinsker's inequality. IEEE Transactions on Information Theory.
Estimating Mutual Information via Kolmogorov Distance. IEEE Transactions on Information Theory.
Mutual information of several random variables and its estimation via variation. Problems of Information Transmission.
On computation of information via variation and inequalities for the entropy function. Problems of Information Transmission.
Generalization of a Pinsker problem. Problems of Information Transmission.
Some upper and lower bounds are obtained for the maximum of the absolute value of the difference between the mutual information |I(X; Y) − I(X′; Y′)| of two pairs of discrete random variables (X, Y) and (X′, Y′) via the variational distance between the probability distributions of these pairs. In particular, the upper bound obtained here substantially generalizes and improves the upper bound of [1]. In some special cases, our upper and lower bounds coincide or are rather close. It is also proved that the lower bound is asymptotically tight in the case where the variational distance between (X, Y) and (X′, Y′) tends to zero.
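The quantities compared in the abstract can be made concrete with a small numerical sketch. The code below (an illustration, not the paper's bounds) computes the mutual information of two joint distributions on a 2×2 alphabet and their variational distance, showing that |I(X; Y) − I(X′; Y′)| shrinks as the perturbation, and hence the variational distance, goes to zero. The distribution `p`, the perturbation `eps`, and the unhalved definition of variational distance (sum of absolute differences) are assumptions made for the example.

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = pxy > 0                        # avoid log of zero
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

def variational_distance(pxy, qxy):
    """Variational distance taken here as sum |p - q| (unhalved)."""
    return float(np.abs(pxy - qxy).sum())

# Joint distribution of (X, Y) and a slightly perturbed (X', Y')
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
eps = 0.01
q = np.array([[0.4 - eps, 0.1 + eps],
              [0.1 + eps, 0.4 - eps]])

v = variational_distance(p, q)                            # 4 * eps
di = abs(mutual_information(p) - mutual_information(q))   # small for small eps
```

Shrinking `eps` drives both `v` and `di` toward zero, which is the continuity phenomenon the asymptotic tightness result quantifies.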