Entropy bounds for a Markov random subfield

  • Authors:
  • Matthew G. Reyes
  • David L. Neuhoff

  • Affiliations:
  • EECS Department, University of Michigan, Ann Arbor, MI (both authors)

  • Venue:
  • ISIT'09: Proceedings of the 2009 IEEE International Symposium on Information Theory - Volume 1
  • Year:
  • 2009

Abstract

Given a Markov random field (MRF) X defined by potentials on a graph G = (V,E), and given a subset U ⊂ V of the sites on which X is defined, we prove, under a positive correlation constraint on the MRF, that the entropy of the subfield XU is upper bounded by the entropy of an MRF defined on the subgraph induced by U with potentials taken directly from those assigned to U in G. To prove this we use exponential family representations of MRFs. We first show that the entropy of an MRF is monotone decreasing in the exponential parameters. We then use the Maximum Entropy principle and a well-known result from information geometry to show that the marginal entropy of XU is upper bounded by the entropy of the MRF on the induced subgraph whose moments match the marginal distribution. We then use the convexity of the log-partition function to show that, to match the marginal moments on the induced subgraph, the exponential coordinates on the induced subgraph must be component-wise greater than the corresponding parameters of the original exponential characterization. The result then follows from the monotonicity property.
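The bound can be checked numerically on a small example. The sketch below (not from the paper; a brute-force illustration under assumed parameter values) uses a ferromagnetic Ising chain 0-1-2 with positive couplings and positive fields, so a positive-correlation hypothesis of the FKG type plausibly holds. It compares the entropy of the marginal of X over U = {0, 1} with the entropy of the Ising MRF on the induced subgraph, which keeps the edge (0,1) and the fields on sites 0 and 1 unchanged.

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy (nats) of a probability vector."""
    return -sum(q * math.log(q) for q in probs if q > 0)

def ising_dist(n, h, J):
    """Brute-force joint distribution of a small Ising MRF.
    h: per-site fields; J: dict mapping edges (i, j) to couplings."""
    weights = []
    for x in itertools.product([-1, 1], repeat=n):
        e = sum(h[i] * x[i] for i in range(n))
        e += sum(Jij * x[i] * x[j] for (i, j), Jij in J.items())
        weights.append(math.exp(e))
    Z = sum(weights)  # log-partition function is log(Z)
    return [w / Z for w in weights]

# Assumed example parameters: chain 0 - 1 - 2, ferromagnetic couplings,
# positive fields (so the positive-correlation condition is expected to hold).
h = [0.1, 0.2, 0.3]
J = {(0, 1): 0.5, (1, 2): 0.5}
p = ising_dist(3, h, J)

# Marginal of X_U for U = {0, 1}: sum out site 2.
marg = {}
for x, q in zip(itertools.product([-1, 1], repeat=3), p):
    marg[x[:2]] = marg.get(x[:2], 0.0) + q
H_marginal = entropy(marg.values())

# MRF on the induced subgraph: potentials taken directly from G,
# i.e. fields h[0], h[1] and the single edge (0, 1).
H_sub = entropy(ising_dist(2, h[:2], {(0, 1): 0.5}))

print(H_marginal <= H_sub)  # the theorem's upper bound
```

Summing out site 2 induces an extra positive effective field on site 1 (log cosh(h[2] + J*x1) is increasing in x1 here), so the marginal is more biased, hence lower in entropy, than the subgraph MRF, which is exactly the mechanism the bound captures. With h[2] = 0 the two distributions coincide and the bound holds with equality.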