On Pinsker's and Vajda's type inequalities for Csiszár's f-divergences

  • Authors:
  • Gustavo L. Gilardoni

  • Affiliations:
  • Department of Statistics, Universidade de Brasília, Brasília, Brazil

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2010

Abstract

Let $D$ and $V$ denote respectively the information divergence and the total variation distance. Pinsker's and Vajda's inequalities are, respectively, $D \ge \frac{1}{2}V^2$ and $D \ge \log\frac{2+V}{2-V} - \frac{2V}{2+V}$. In this paper, several generalizations and improvements of these inequalities are established for wide classes of $f$-divergences. First, conditions on $f$ are determined under which an $f$-divergence $D_f$ will satisfy $D_f \ge c_f V^2$ or $D_f \ge c_{2,f} V^2 + c_{4,f} V^4$, where the constants $c_f$, $c_{2,f}$, and $c_{4,f}$ are best possible. As a consequence, lower bounds in terms of $V$ are obtained for many well-known distance and divergence measures, including the $\chi^2$ and Hellinger's discriminations and the families of Tsallis' and Rényi's divergences. For instance, if $D^{(\alpha)}(P\|Q) = [\alpha(\alpha-1)]^{-1}\left[\int p^\alpha q^{1-\alpha}\,d\mu - 1\right]$ and $I_\alpha(P\|Q) = (\alpha-1)^{-1}\log\left[\int p^\alpha q^{1-\alpha}\,d\mu\right]$ are respectively the relative information of type $\alpha$ and Rényi's information gain of order $\alpha$, it is shown that $D^{(\alpha)} \ge \frac{1}{2}V^2 + \frac{1}{72}(\alpha+1)(2-\alpha)V^4$ whenever $-1 \le \alpha \le 2$, $\alpha \ne 0, 1$, and that $I_\alpha \ge \frac{\alpha}{2}V^2 + \frac{1}{36}\alpha(1+5\alpha-5\alpha^2)V^4$ for $0 < \alpha < 1$. Besides these inequalities, which are valid only for small variation ($V$ close to zero), lower bounds for $D_f$ that are accurate for both small and large variation ($V$ close to two) are also obtained. In the special case of the information divergence, they imply that $D \ge \log\frac{2}{2-V} - \frac{2-V}{2}\log\frac{2+V}{2}$, which uniformly improves Vajda's inequality.
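As a quick numerical illustration of these bounds (not part of the paper itself), the following minimal Python sketch evaluates $D$, $V$, and the three lower bounds for a pair of Bernoulli distributions; the particular choice $P = \mathrm{Bernoulli}(0.9)$, $Q = \mathrm{Bernoulli}(0.2)$ is an arbitrary example.

```python
import math

def kl(p, q):
    """Information divergence D(P||Q) in nats for P = Bernoulli(p), Q = Bernoulli(q)."""
    terms = []
    for pi, qi in ((p, q), (1 - p, 1 - q)):
        if pi > 0:  # 0 * log(0/q) is taken to be 0
            terms.append(pi * math.log(pi / qi))
    return sum(terms)

def tv(p, q):
    """Total variation V(P,Q) = sum_x |P(x) - Q(x)|, so 0 <= V <= 2 as in the paper."""
    return abs(p - q) + abs((1 - p) - (1 - q))

p, q = 0.9, 0.2  # arbitrary example distributions
D, V = kl(p, q), tv(p, q)

pinsker = 0.5 * V**2
vajda = math.log((2 + V) / (2 - V)) - 2 * V / (2 + V)
improved = math.log(2 / (2 - V)) - (2 - V) / 2 * math.log((2 + V) / 2)

print(f"D = {D:.4f}, V = {V:.4f}")
print(f"Pinsker : D >= {pinsker:.4f}")
print(f"Vajda   : D >= {vajda:.4f}")
print(f"Improved: D >= {improved:.4f}")  # should dominate Vajda's bound
```

With these inputs, $V = 1.4$ and $D \approx 1.146$; all three bounds hold, and the improved bound ($\approx 1.045$) dominates Vajda's ($\approx 0.911$), consistent with the uniform improvement stated in the abstract.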