Let V and D denote, respectively, total variation and divergence. We study lower bounds on D when V is held fixed. The best possible (i.e., largest) such lower bound defines a function L = L(V), Vajda's (1970) tight lower bound. The main result is an exact parametrization of L. This leads to Taylor polynomials that are themselves lower bounds for L, and thereby to extensions of the classical Pinsker (1960) inequality, which has numerous applications; cf. Pinsker and his followers.
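The two classical bounds mentioned above can be checked numerically. The sketch below is illustrative only, not the paper's parametrization of L: it assumes natural logarithms, defines V = Σ|P(x) − Q(x)| (so V ∈ [0, 2]), uses Pinsker's bound D ≥ V²/2, and uses the commonly cited closed form of Vajda's bound, L(V) ≥ ln((2+V)/(2−V)) − 2V/(2+V). The Bernoulli parameters are arbitrary example values.

```python
import math

def kl(p, q):
    """KL divergence D(P||Q) in nats for Bernoulli(p) vs. Bernoulli(q)."""
    def term(a, b):
        return 0.0 if a == 0.0 else a * math.log(a / b)
    return term(p, q) + term(1.0 - p, 1.0 - q)

def total_variation(p, q):
    """V = sum_x |P(x) - Q(x)| for Bernoulli(p), Bernoulli(q); V in [0, 2]."""
    return abs(p - q) + abs((1.0 - p) - (1.0 - q))  # equals 2|p - q|

def pinsker_bound(v):
    """Classical Pinsker (1960) lower bound: D >= V^2 / 2."""
    return v * v / 2.0

def vajda_bound(v):
    """Vajda's (1970) lower bound, commonly cited closed form (v < 2)."""
    return math.log((2.0 + v) / (2.0 - v)) - 2.0 * v / (2.0 + v)

# Example pair of Bernoulli distributions (hypothetical values).
p, q = 0.8, 0.3
d = kl(p, q)
v = total_variation(p, q)

# Both expressions lower-bound the divergence; neither dominates the
# other for all V (Pinsker is better for small V, Vajda for V near 2).
assert d >= pinsker_bound(v)
assert d >= vajda_bound(v)
```

Note that neither bound equals the tight function L(V) everywhere: Pinsker's quadratic is sharper for small V, while Vajda's bound diverges as V → 2 and is sharper there, which is what motivates the exact parametrization of L studied in the paper.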