Mutual information, variation, and Fano's inequality

  • Authors:
  • V. V. Prelov; E. C. van der Meulen

  • Affiliations:
  • Kharkevich Institute for Information Transmission Problems, RAS, Moscow, Russia; Katholieke Universiteit Leuven, Leuven, Belgium

  • Venue:
  • Problems of Information Transmission
  • Year:
  • 2008


Abstract

Some upper and lower bounds are obtained for the maximum of the absolute value of the difference between the mutual informations |I(X; Y) − I(X′; Y′)| of two pairs of discrete random variables (X, Y) and (X′, Y′) via the variational distance between the probability distributions of these pairs. In particular, the upper bound obtained here substantially generalizes and improves the upper bound of [1]. In some special cases, our upper and lower bounds coincide or are rather close. It is also proved that the lower bound is asymptotically tight in the case where the variational distance between (X, Y) and (X′, Y′) tends to zero.
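To make the two quantities in the abstract concrete, the following sketch computes the mutual information of two discrete joint distributions and the variational (L1) distance between them, then compares |I(X; Y) − I(X′; Y′)| against that distance. This is only an illustration of the quantities being related, not the paper's bounds; the example distributions `p` and `q` are arbitrary choices, and the L1 convention for variational distance (rather than half the L1 norm) is an assumption.

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in nats for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = p_xy > 0                         # 0 * log 0 is taken as 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

def variational_distance(p_xy, q_xy):
    """Variational distance as the L1 norm of the difference of the joint pmfs
    (some texts use half this value)."""
    return float(np.abs(np.asarray(p_xy) - np.asarray(q_xy)).sum())

# Two nearby joint distributions (X, Y) and (X', Y') on a 2x2 alphabet
p = np.array([[0.40, 0.10],
              [0.10, 0.40]])
q = np.array([[0.38, 0.12],
              [0.12, 0.38]])

delta_I = abs(mutual_information(p) - mutual_information(q))
d = variational_distance(p, q)
print(f"|I - I'| = {delta_I:.4f} nats, variational distance = {d:.4f}")
```

As the variational distance between the pairs shrinks, the mutual-information difference shrinks as well, which is the regime in which the paper's lower bound is shown to be asymptotically tight.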