Let $D$ and $V$ denote, respectively, the information divergence and the total variation distance between two probability distributions. Pinsker's and Vajda's inequalities are, respectively, $D \ge \tfrac{1}{2}V^2$ and $D \ge \log\tfrac{2+V}{2-V} - \tfrac{2V}{2+V}$. In this paper, several generalizations and improvements of these inequalities are established for wide classes of $f$-divergences. First, conditions on $f$ are determined under which an $f$-divergence $D_f$ will satisfy $D_f \ge c_f V^2$ or $D_f \ge c_{2,f} V^2 + c_{4,f} V^4$, where the constants $c_f$, $c_{2,f}$ and $c_{4,f}$ are best possible. As a consequence, lower bounds in terms of $V$ are obtained for many well-known distance and divergence measures, including the $\chi^2$ and Hellinger's discriminations and the families of Tsallis' and Rényi's divergences. For instance, if $D^{(\alpha)}(P\|Q) = [\alpha(\alpha-1)]^{-1}\bigl[\int p^{\alpha} q^{1-\alpha}\,d\mu - 1\bigr]$ and $I_\alpha(P\|Q) = (\alpha-1)^{-1}\log\bigl[\int p^{\alpha} q^{1-\alpha}\,d\mu\bigr]$ are, respectively, the relative information of type $\alpha$ and Rényi's information gain of order $\alpha$, it is shown that $D^{(\alpha)} \ge \tfrac{1}{2}V^2 + \tfrac{1}{72}(\alpha+1)(2-\alpha)V^4$ whenever $-1 \le \alpha \le 2$, $\alpha \ne 0,1$, and that $I_\alpha \ge \tfrac{\alpha}{2}V^2 + \tfrac{1}{36}\alpha(1+5\alpha-5\alpha^2)V^4$ for $0 < \alpha < 1$. Second, in addition to bounds in terms of powers of $V$, which are accurate for small variation ($V$ close to zero), lower bounds for $D_f$ that are accurate for both small and large variation ($V$ close to two) are also obtained. In the special case of the information divergence they imply that $D \ge \log\tfrac{2}{2-V} - \tfrac{2-V}{2}\log\tfrac{2+V}{2}$, which uniformly improves Vajda's inequality.
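As a minimal numerical sketch (not part of the paper), the script below evaluates the information divergence $D$ and the total variation distance $V$ for random discrete distributions and checks the three lower bounds quoted in the abstract: Pinsker's bound, Vajda's bound, and the improved bound $\log\tfrac{2}{2-V} - \tfrac{2-V}{2}\log\tfrac{2+V}{2}$. The function and variable names are illustrative only; natural logarithms are used and $V$ is taken as the $L_1$ distance, so it lies in $[0,2]$ as in the abstract.

import numpy as np

def info_divergence(p, q):
    """Information divergence D(P||Q) = sum p_i log(p_i/q_i), natural log."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """Total variation distance V(P,Q) = sum |p_i - q_i|, with values in [0, 2]."""
    return float(np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float))))

def pinsker_bound(V):
    # Pinsker: D >= (1/2) V^2
    return 0.5 * V**2

def vajda_bound(V):
    # Vajda: D >= log((2+V)/(2-V)) - 2V/(2+V)
    return np.log((2 + V) / (2 - V)) - 2 * V / (2 + V)

def improved_bound(V):
    # Improved bound from the abstract: D >= log(2/(2-V)) - ((2-V)/2) log((2+V)/2)
    return np.log(2 / (2 - V)) - ((2 - V) / 2) * np.log((2 + V) / 2)

rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    D, V = info_divergence(p, q), total_variation(p, q)
    print(f"D={D:.4f}  V={V:.4f}  Pinsker={pinsker_bound(V):.4f}  "
          f"Vajda={vajda_bound(V):.4f}  Improved={improved_bound(V):.4f}")
    assert D >= pinsker_bound(V) - 1e-12
    assert D >= vajda_bound(V) - 1e-12
    assert D >= improved_bound(V) - 1e-12

On such random examples the improved bound is never smaller than Vajda's bound, illustrating (but of course not proving) the uniform improvement claimed in the abstract.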