On Estimation of Information via Variation. Problems of Information Transmission.
On inequalities between mutual information and variation. Problems of Information Transmission.
Mutual information, variation, and Fano's inequality. Problems of Information Transmission.
Mutual information of several random variables and its estimation via variation. Problems of Information Transmission.
This paper supplements the author's paper [1]. We obtain an explicit formula which, in a special case, allows us to compute the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We also establish two new inequalities for the binary entropy function that are related to the problem considered here.
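The quantities appearing in the abstract can be illustrated numerically. The sketch below is not the paper's explicit formula; it simply computes, for a discrete joint distribution, the mutual information, the variational distance between the joint distribution and the product of its marginals, and the binary entropy function. All function names are illustrative, and natural logarithms are assumed.

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy h(p) = -p log p - (1-p) log(1-p), in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log(p) - (1 - p) * np.log(1 - p))

def mutual_information(joint):
    """Mutual information I(X;Y) of a 2-D joint distribution, in nats."""
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    prod = px * py                          # product of marginals
    mask = joint > 0                        # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask])))

def variational_distance(joint):
    """Variational distance sum_{x,y} |p(x,y) - p(x)p(y)| between the
    joint distribution and the product of its marginals."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return float(np.abs(joint - px * py).sum())

# Example: a correlated binary pair with uniform marginals
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(mutual_information(joint))    # ~0.1927 nats
print(variational_distance(joint))  # 0.6
```

For independent variables both quantities vanish, and increasing the correlation in the example above increases both, which is the qualitative relationship the inequalities in this line of work make precise.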