An information theoretic analysis of maximum likelihood mixture estimation for exponential families

  • Authors:
  • Arindam Banerjee; Inderjit Dhillon; Joydeep Ghosh; Srujana Merugu

  • Affiliations:
  • University of Texas at Austin, Austin, TX (all authors)

  • Venue:
  • ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning
  • Year:
  • 2004

Abstract

An important task in unsupervised learning is maximum likelihood mixture estimation (MLME) for exponential families. In this paper, we prove a mathematical equivalence between this MLME problem and the rate distortion problem for Bregman divergences. We also present new theoretical results in rate distortion theory for Bregman divergences. Further, we analyze both problems as a trade-off between compression and preservation of information, and show that this analysis yields the information bottleneck method as an interesting special case.
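
For context, the equivalence the abstract refers to rests on the bijection between regular exponential families and regular Bregman divergences developed in this line of work. The sketch below uses notation assumed here rather than taken from the abstract: ψ is the cumulant (log-partition) function of the exponential family, φ its Legendre conjugate, and μ = ∇ψ(θ) the expectation parameter.

```latex
% Sketch (assumed notation, not quoted from the paper):
% an exponential-family density can be rewritten in terms of the
% Bregman divergence d_phi generated by the conjugate function phi.
\[
  p_\psi(x \mid \theta)
  \;=\; \exp\bigl(\langle x, \theta \rangle - \psi(\theta)\bigr)\, p_0(x)
  \;=\; \exp\bigl(-d_\phi(x, \mu)\bigr)\, b_\phi(x),
  \qquad \mu = \nabla\psi(\theta),
\]
% where b_phi(x) = exp(phi(x)) p_0(x) does not depend on theta, and
\[
  d_\phi(x, \mu)
  \;=\; \phi(x) - \phi(\mu) - \langle x - \mu,\, \nabla\phi(\mu) \rangle .
\]
```

Under this identification, the negative log-likelihood of a point under a mixture component equals, up to a term independent of the parameters, the Bregman divergence to that component's mean; this is what ties maximum likelihood mixture estimation to a rate distortion problem with Bregman distortion.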