Minimax Entropy Principle and Its Application to Texture Modeling

  • Authors: Song Chun Zhu; Ying Nian Wu; David Mumford

  • Affiliations: Division of Applied Mathematics, Brown University, Providence, RI 02912, U.S.A.; Department of Statistics, University of Michigan, Ann Arbor, MI 48109, U.S.A.; Division of Applied Mathematics, Brown University, Providence, RI 02912, U.S.A.

  • Venue: Neural Computation

  • Year: 1997

Abstract

This article proposes a general theory and methodology, called the minimax entropy principle, for building statistical models for images (or signals) in a variety of applications. This principle consists of two parts. The first is the maximum entropy principle for feature binding (or fusion): for a given set of observed feature statistics, a distribution can be built to bind these feature statistics together by maximizing the entropy over all distributions that reproduce them. The second part is the minimum entropy principle for feature selection: among all plausible sets of feature statistics, we choose the set whose maximum entropy distribution has the minimum entropy. Computational and inferential issues in both parts are addressed; in particular, a feature pursuit procedure is proposed for approximately selecting the optimal set of features. The minimax entropy principle is then corrected by considering the sample variation in the observed feature statistics, and an information criterion for feature pursuit is derived. The minimax entropy principle is applied to texture modeling, where a novel Markov random field (MRF) model, called FRAME (filter, random field, and minimax entropy), is derived, and encouraging results are obtained in experiments on a variety of texture images. The relationship between our theory and the mechanisms of neural computation is also discussed.
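
As a rough illustration of the maximum entropy "binding" step described above, the sketch below (not the authors' code; the toy state space, the two feature statistics, and the target values are all hypothetical) fits an exponential-family model p(x) ∝ exp(λ·φ(x)) over short binary sequences by adjusting the Lagrange multipliers λ until the model's expected feature statistics match the observed ones. In FRAME the states are full images, the statistics are histograms of filter responses, and the model expectations are estimated by Gibbs sampling rather than by the exact enumeration used here.

```python
# Minimal sketch (assumed, not from the paper) of maximum entropy fitting
# on a toy problem: binary sequences of length 8 with two made-up statistics.
import itertools
import numpy as np

def features(x):
    """Two toy feature statistics of a binary sequence x."""
    x = np.asarray(x)
    frac_ones = x.mean()                     # fraction of 1s
    frac_agree = (x[:-1] == x[1:]).mean()    # fraction of equal neighbors
    return np.array([frac_ones, frac_agree])

# Enumerate the whole state space (feasible only at this toy size).
states = np.array(list(itertools.product([0, 1], repeat=8)))
phi = np.array([features(s) for s in states])

# "Observed" statistics the model should reproduce (hypothetical values).
target = np.array([0.3, 0.8])

# Fit the Lagrange multipliers lam of p(x) ∝ exp(lam · phi(x)) by gradient
# ascent: move lam until the model expectations match the targets.
lam = np.zeros(2)
for _ in range(2000):
    logp = phi @ lam
    logp -= logp.max()                       # numerical stability
    p = np.exp(logp)
    p /= p.sum()
    model_stats = p @ phi                    # E_p[phi(x)]
    lam += 0.5 * (target - model_stats)      # matching update

print("fitted multipliers:", lam)
print("model statistics:  ", p @ phi, "(target:", target, ")")
```

The minimum entropy half of the principle would then compare alternative feature sets by the entropy of their fitted maximum entropy models, preferring the set whose model has the lowest entropy; the feature pursuit procedure in the article approximates this selection greedily.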