Information Theory: Coding Theorems for Discrete Memoryless Systems
On Estimation of Information via Variation. Problems of Information Transmission.
On inequalities between mutual information and variation. Problems of Information Transmission.
Mutual information, variation, and Fano's inequality. Problems of Information Transmission.
Refinements of Pinsker's inequality. IEEE Transactions on Information Theory.
Estimating Mutual Information Via Kolmogorov Distance. IEEE Transactions on Information Theory.
On computation of information via variation and inequalities for the entropy function. Problems of Information Transmission.
Generalization of a Pinsker problem. Problems of Information Transmission.
We obtain upper and lower bounds on the maximum mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, some properties of the variational distance between probability distributions of this type are derived. We show that in some special cases the estimates of the maximum mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
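The relationship the abstract describes can be illustrated numerically in the bivariate case. The sketch below (an assumption-laden illustration, not the paper's method) computes the mutual information I(X;Y) = D(P_XY || P_X × P_Y) of a small joint distribution together with the variational (L1) distance V between the joint and the product of its marginals, and checks the classical Pinsker-type lower bound I ≥ V²/2 (in nats), which is the simplest instance of the kind of inequality the paper refines.

```python
import numpy as np

# Hypothetical joint distribution of two binary random variables
# (rows index X, columns index Y); values sum to 1.
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])

p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y
q_xy = np.outer(p_x, p_y)       # product of the marginals

# Mutual information I(X;Y) = D(P_XY || P_X x P_Y), in nats.
mask = p_xy > 0
mi = float(np.sum(p_xy[mask] * np.log(p_xy[mask] / q_xy[mask])))

# Variational (L1) distance between the joint and the product of marginals.
var_dist = float(np.abs(p_xy - q_xy).sum())

# Pinsker's inequality gives the lower bound I >= V^2 / 2.
print(f"I(X;Y) = {mi:.4f} nats, V = {var_dist:.4f}, "
      f"Pinsker bound V^2/2 = {var_dist**2 / 2:.4f}")
```

For this joint distribution the mutual information is about 0.25 nats and the Pinsker bound V²/2 is about 0.20, so the inequality holds with some slack; the paper's results concern sharper and multivariate versions of such bounds.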