Learning with hidden variables is a central challenge in probabilistic graphical models and has important implications for many real-life problems. The classical approach uses the Expectation Maximization (EM) algorithm. This algorithm, however, can get trapped in local maxima. In this paper we explore a new approach based on the Information Bottleneck principle, in which we view the learning problem as a tradeoff between two information-theoretic objectives. The first is to make the hidden variables uninformative about the identity of specific instances. The second is to make the hidden variables informative about the observed attributes. By exploring different tradeoffs between these two objectives, we can gradually converge on a high-scoring solution. As we show, the resulting Information Bottleneck Expectation Maximization (IB-EM) algorithm finds solutions that are superior to those of standard EM methods.
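The annealing idea in the abstract can be illustrated with a sketch. This is not the paper's IB-EM derivation; it is a minimal tempered ("deterministic annealing") EM for a one-dimensional Gaussian mixture, where a hypothetical tradeoff parameter `beta` scales the E-step log-posterior: a small `beta` keeps the hidden assignments near-uniform (uninformative about individual instances), while `beta` approaching 1 recovers standard EM (assignments fully informative about the observations). The data, initialization, and schedule are all illustrative assumptions.

```python
import numpy as np

def annealed_em(x, k=2, betas=(0.8, 0.9, 1.0), iters=30):
    """Tempered EM for a 1-D Gaussian mixture.

    The E-step posterior is raised to the power beta (then renormalized),
    so low beta smooths the responsibilities toward uniform and the
    schedule gradually tightens toward standard EM at beta = 1.
    """
    # Deterministic, spread-out initialization via quantiles (an
    # illustrative choice, not part of the paper's method).
    mu = np.quantile(x, [(i + 1) / (k + 1) for i in range(k)])
    var = np.full(k, x.var())        # component variances
    pi = np.full(k, 1.0 / k)         # mixing weights
    for beta in betas:
        for _ in range(iters):
            # E-step: per-component log joint, tempered by beta;
            # subtracting the row max keeps the exponentials stable.
            logp = (np.log(pi)
                    - 0.5 * np.log(2.0 * np.pi * var)
                    - 0.5 * (x[:, None] - mu) ** 2 / var)
            w = np.exp(beta * (logp - logp.max(axis=1, keepdims=True)))
            r = w / w.sum(axis=1, keepdims=True)   # responsibilities
            # M-step: standard weighted maximum-likelihood updates.
            nk = r.sum(axis=0)
            mu = (r * x[:, None]).sum(axis=0) / nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
            pi = nk / x.size
    return mu, var, pi
```

On two well-separated clusters, the low-`beta` stages keep the assignments soft while the means settle, and the final `beta = 1` stage behaves as ordinary EM; the schedule plays the role of the gradually shifted tradeoff described above.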