A non-Shannon-type conditional inequality of information quantities

  • Authors:
  • Z. Zhang; R. W. Yeung

  • Affiliations:
  • Commun. Sci. Inst., Univ. of Southern California, Los Angeles, CA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1997

Abstract

Given n discrete random variables Ω = {X1, …, Xn}, associated with any subset α of {1, 2, …, n} there is a joint entropy H(Xα), where Xα = {Xi : i ∈ α}. This can be viewed as a function defined on the power set 2^{1,2,…,n} taking values in [0, +∞); we call this function the entropy function of Ω. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that it is nondecreasing; and the nonnegativity of the conditional mutual informations implies that it is two-alternating (submodular). These properties are the so-called basic information inequalities of Shannon's information measures. An entropy function can thus be viewed as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set {1, 2, …, n}. As introduced by Yeung (see ibid., vol. 43, no. 6, p. 1923-34, 1997), Γn stands for the cone in ℝ^(2^n − 1) consisting of all vectors having these three properties, and Γn* stands for the set of all (2^n − 1)-dimensional vectors that correspond to the entropy functions of some set of n discrete random variables. A fundamental information-theoretic problem is whether or not Γ̄n* = Γn, where Γ̄n* stands for the closure of Γn*. We show that Γ̄n* is a convex cone, that Γ2* = Γ2 and Γ3* ≠ Γ3, but Γ̄3* = Γ3. For four random variables, we have discovered a conditional inequality which is not implied by the basic information inequalities of the same set of random variables. This lends evidence to the plausible conjecture that Γ̄n* ≠ Γn for n > 3.
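
To make these objects concrete, here is a minimal Python sketch (ours, not from the paper) that computes the entropy function of a joint pmf and tests the three basic inequalities, i.e., membership in the cone Γn. The names entropy_function and in_gamma_n, and the XOR example, are illustrative assumptions rather than constructions from the paper.

```python
import itertools

import numpy as np


def entropy_function(pmf):
    """Entropy function of Ω: maps each nonempty subset alpha of the axes
    {0, ..., n-1} of the joint pmf to the joint entropy H(X_alpha) in bits."""
    n = pmf.ndim
    h = {}
    for r in range(1, n + 1):
        for alpha in itertools.combinations(range(n), r):
            other = tuple(i for i in range(n) if i not in alpha)
            marginal = pmf.sum(axis=other)   # marginalize out variables not in alpha
            p = marginal.ravel()
            p = p[p > 0]                     # 0·log 0 = 0 by convention
            h[frozenset(alpha)] = float(-(p * np.log2(p)).sum())
    return h


def in_gamma_n(h, tol=1e-9):
    """Check the three basic inequalities defining the cone Γn:
    nonnegativity, monotonicity, and two-alternation (submodularity)."""
    h = dict(h)
    h[frozenset()] = 0.0                     # convention: H(X_∅) = 0
    for a in h:
        if h[a] < -tol:                      # nonnegative
            return False
    for a, b in itertools.product(h, repeat=2):
        if a <= b and h[a] > h[b] + tol:     # nondecreasing
            return False
        if h[a] + h[b] < h[a | b] + h[a & b] - tol:  # two-alternating
            return False
    return True


# Example: X1, X2 independent fair bits, X3 = X1 XOR X2.
pmf = np.zeros((2, 2, 2))
for x1, x2 in itertools.product((0, 1), repeat=2):
    pmf[x1, x2, x1 ^ x2] = 0.25

assert in_gamma_n(entropy_function(pmf))     # entropy functions always lie in Γn
```

As the assertion illustrates, every entropy function satisfies the basic inequalities, so Γn* ⊆ Γn; the question the abstract poses is whether the closure Γ̄n* already fills the whole cone Γn.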