An important task in unsupervised learning is maximum likelihood mixture estimation (MLME) for exponential families. In this paper, we prove a mathematical equivalence between this MLME problem and the rate distortion problem for Bregman divergences. We also present new theoretical results in rate distortion theory for Bregman divergences. Further, we analyze both problems as a trade-off between compression and preservation of information, an analysis that yields the information bottleneck method as an interesting special case.
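To make the MLME–rate distortion connection concrete, the following is a minimal sketch (not the authors' implementation) of EM-style soft clustering under the squared Euclidean Bregman divergence, which corresponds to a spherical Gaussian mixture. The E-step computes posteriors proportional to `pi_h * exp(-d(x, mu_h))`, and the M-step updates each mean to a posterior-weighted average, i.e., the Bregman centroid. The farthest-point initialization is an assumption added here for determinism.

```python
import numpy as np

def bregman_soft_cluster(X, k, n_iter=50):
    """EM-style soft clustering under the squared-Euclidean Bregman
    divergence (equivalent to a spherical Gaussian mixture).

    A sketch for illustration; initialization and stopping rule are
    simplistic assumptions, not part of the paper.
    """
    # Greedy farthest-point initialization (deterministic).
    idx = [0]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - X[idx][None]) ** 2).sum(-1), axis=1)
        idx.append(int(np.argmax(d)))
    mu = X[idx].astype(float)          # cluster means
    pi = np.full(k, 1.0 / k)           # mixing weights

    for _ in range(n_iter):
        # E-step: posterior p(h|x) proportional to pi_h * exp(-d(x, mu_h)),
        # computed in log space for numerical stability.
        d = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        logp = np.log(pi) - d
        logp -= logp.max(axis=1, keepdims=True)
        p = np.exp(logp)
        p /= p.sum(axis=1, keepdims=True)
        # M-step: means become posterior-weighted averages (Bregman centroids).
        pi = p.mean(axis=0)
        mu = (p.T @ X) / p.sum(axis=0)[:, None]
    return mu, pi, p
```

For any Bregman divergence, only the distance computation in the E-step and the centroid formula in the M-step change; the algorithmic skeleton is identical, which is one face of the equivalence the paper formalizes.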