The interplay between entropy and variational distance

  • Authors:
  • Siu-Wai Ho; Raymond W. Yeung

  • Affiliations:
  • Institute for Telecommunications Research, University of South Australia, Adelaide, SA, Australia; Department of Information Engineering, The Chinese University of Hong Kong, N.T., Hong Kong

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2010

Abstract

The relation between the Shannon entropy and the variational distance, two fundamental and frequently used quantities in information theory, is studied in this paper by means of certain bounds on the entropy difference between two probability distributions in terms of the variational distance between them and their alphabet sizes. We also show how to find the distribution that achieves the minimum (or maximum) entropy among all distributions within a given variational distance of any given distribution. These results are applied to solve a number of problems of fundamental interest. For entropy estimation, we obtain an analytic formula for the confidence interval, solving a problem that has been open for more than 30 years. For approximation of probability distributions, we find the minimum entropy difference between two distributions in terms of their alphabet sizes and the variational distance between them. In particular, we show that the entropy difference between two distributions that are close in variational distance can be arbitrarily large if the alphabet sizes of the two distributions are unconstrained. For random number generation, we characterize the tradeoff between the amount of randomness required and the distortion in terms of variational distance. New tools for non-convex optimization are developed to establish the results in this paper.
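
The claim that two distributions can be close in variational distance yet arbitrarily far apart in entropy when alphabet sizes are unconstrained admits a simple numerical illustration. The sketch below is not from the paper; it is a minimal Python demonstration, assuming variational distance is measured as the (unnormalized) sum of absolute differences. It spreads a small mass eps over an increasingly large alphabet, so the distance stays fixed at 2*eps while the entropy gap grows roughly like eps*log2(m).

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability symbols."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def variational_distance(p, q):
    """Sum of |p_i - q_i| over a common alphabet (unnormalized L1 distance)."""
    return np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

eps = 0.01  # small perturbation kept fixed as the alphabet grows
for m in [10, 10**3, 10**6]:
    # P puts all mass on one symbol; Q moves mass eps uniformly onto m extra symbols.
    p = np.concatenate(([1.0], np.zeros(m)))
    q = np.concatenate(([1.0 - eps], np.full(m, eps / m)))
    print(f"m={m:>8}  d(P,Q)={variational_distance(p, q):.4f}  "
          f"|H(P)-H(Q)|={abs(entropy(p) - entropy(q)):.4f} bits")
```

Running the sketch shows d(P,Q) pinned at 0.02 while the entropy difference increases without bound as m grows, which is the phenomenon the abstract describes.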