Minimax redundancy for the class of memoryless sources

  • Authors:
  • Qun Xie; A. R. Barron

  • Affiliations:
  • Dept. of Stat., Yale Univ., New Haven, CT

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1997

Abstract

Let X^n = (X_1, ..., X_n) be a memoryless source with unknown distribution on a finite alphabet of size k. We identify the asymptotic minimax coding redundancy for this class of sources, and provide a sequence of asymptotically minimax codes. Equivalently, we determine the limiting behavior of the minimax relative entropy min_{Q_{X^n}} max_{P_{X^n}} D(P_{X^n} || Q_{X^n}), where the maximum is over all independent and identically distributed (i.i.d.) source distributions and the minimum is over all joint distributions. We show that the minimax redundancy minus ((k-1)/2) log(n/(2πe)) converges to log ∫ √(det I(θ)) dθ = log(Γ(1/2)^k / Γ(k/2)), where I(θ) is the Fisher information and the integral is over the whole probability simplex. The Bayes strategy using Jeffreys' prior is shown to be asymptotically maximin but not asymptotically minimax in our setting. The boundary risk using Jeffreys' prior is higher than that of interior points. We provide a sequence of modifications of Jeffreys' prior that put some prior mass near the boundaries of the probability simplex, pulling the boundary risk down to the asymptotic minimax level in the limit.
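
To make the quantities in the abstract concrete, here is a minimal Python sketch (not from the paper; the function names, the bits-per-symbol-sequence convention, and the demo parameters are our own choices). It evaluates the asymptotic minimax level ((k-1)/2) log2(n/(2πe)) + log2(Γ(1/2)^k / Γ(k/2)), and the code length of the Jeffreys-prior mixture computed via the Krichevsky-Trofimov predictive rule, which is the standard sequential form of the Dirichlet(1/2, ..., 1/2) mixture for memoryless sources.

```python
import math

def asymptotic_minimax_redundancy_bits(n: int, k: int) -> float:
    """Asymptotic minimax redundancy level (in bits) stated in the abstract:
    ((k-1)/2) * log2(n / (2*pi*e)) + log2(Gamma(1/2)**k / Gamma(k/2))."""
    log_const_nats = k * math.lgamma(0.5) - math.lgamma(k / 2.0)
    return (0.5 * (k - 1) * math.log2(n / (2.0 * math.pi * math.e))
            + log_const_nats / math.log(2.0))

def jeffreys_mixture_codelength_bits(x, k: int) -> float:
    """Code length -log2 Q(x) under the Jeffreys (Dirichlet(1/2,...,1/2))
    mixture, accumulated symbol by symbol with the Krichevsky-Trofimov
    predictive rule Q(next = a | past) = (count_a + 1/2) / (t + k/2)."""
    counts = [0] * k
    bits = 0.0
    for t, a in enumerate(x):
        bits -= math.log2((counts[a] + 0.5) / (t + 0.5 * k))
        counts[a] += 1
    return bits

if __name__ == "__main__":
    n, k = 10_000, 2
    print(f"asymptotic minimax level : "
          f"{asymptotic_minimax_redundancy_bits(n, k):.3f} bits")
    # A boundary source (always emits symbol 0) gives its single possible
    # sequence probability 1, so the mixture code length IS the redundancy.
    print(f"Jeffreys risk at boundary: "
          f"{jeffreys_mixture_codelength_bits([0] * n, k):.3f} bits")
```

With n = 10,000 and k = 2 this prints roughly 6.2 bits for the asymptotic minimax level against roughly 7.5 bits for the Jeffreys mixture at a vertex of the simplex, illustrating the abstract's point that the unmodified Jeffreys prior over-pays near the boundary even though it is asymptotically maximin.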