Given a Markov random field (MRF) X defined by potentials on a graph G = (V,E), and a subset U ⊂ V of the sites on which X is defined, we prove, under a positive-correlation constraint on the MRF, that the entropy of the subfield XU is upper bounded by the entropy of an MRF defined on the subgraph induced by U, with potentials taken directly from those assigned to U in G. The proof uses exponential family representations of MRFs. We first show that the entropy of an MRF is monotone decreasing in its exponential parameters. Using the Maximum Entropy principle and a well-known result from information geometry, we then show that the marginal entropy of XU is upper bounded by the entropy of the MRF on the induced subgraph whose moments match the marginal distribution. Finally, using the convexity of the log-partition function, we show that matching the marginal moments on the induced subgraph requires exponential coordinates on the induced subgraph that are component-wise greater than the corresponding parameters of the original exponential characterization. The result then follows from monotonicity.
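The claimed bound can be checked numerically on a toy instance. The following is a minimal sketch, not the paper's construction: it assumes a 3-site Ising chain with a positive pairwise coupling θ (so the positive-correlation constraint holds) and takes U to be the two non-adjacent endpoints, so the subgraph induced by U has no edges and the MRF it defines, with no potentials, is uniform on the four spin configurations.

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy in nats of a probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)

theta = 0.8            # positive coupling: the positive-correlation regime
edges = [(0, 1), (1, 2)]  # chain MRF on sites V = {0, 1, 2}

# Joint distribution of the full Ising chain by brute-force enumeration.
configs = list(itertools.product([-1, 1], repeat=3))
weights = [math.exp(sum(theta * x[i] * x[j] for i, j in edges))
           for x in configs]
Z = sum(weights)
joint = [w / Z for w in weights]

# Marginal of X_U for U = {0, 2}: sum out the middle site X_1.
marginal = {}
for x, p in zip(configs, joint):
    key = (x[0], x[2])
    marginal[key] = marginal.get(key, 0.0) + p
H_marginal = entropy(marginal.values())

# The subgraph induced by U = {0, 2} has no edges and no potentials,
# so the induced MRF is uniform on {-1, +1}^2 with entropy log 4.
H_induced = math.log(4)

print(H_marginal, H_induced)
```

Because X0 and X2 are positively correlated through X1, the marginal is not uniform, so its entropy falls strictly below log 4, the entropy of the edgeless induced MRF, consistent with the stated bound.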