Factor graphs allow large probability distributions to be stored efficiently and facilitate fast computation of marginal probabilities, but the difficulty of training them has limited their use. Given a large set of data points, the training process should yield factors that assign high likelihood to the observed data. We present a factor graph learning algorithm that, on each iteration, merges adjacent factors, performs expectation maximization on the resulting modified factor graph, and then splits the joined factors using non-negative matrix factorization. We show that this multifactor expectation maximization algorithm converges to the global maximum of the likelihood on difficult learning problems much faster and more reliably than traditional expectation maximization.
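The splitting step described above can be sketched with a plain multiplicative-update NMF: a merged factor over two discrete variables is a non-negative table, and factoring it recovers two smaller factors connected through a latent variable whose cardinality is the factorization rank. This is a minimal illustration, not the paper's implementation; the function name `nmf_split`, the update count, and the toy table are assumptions.

```python
import numpy as np

def nmf_split(M, rank, n_iters=200, eps=1e-9):
    # Split a merged non-negative factor table M (a x b) into two
    # factors W (a x rank) and H (rank x b) with M ~= W @ H, using
    # Lee-Seung multiplicative updates for the Frobenius objective.
    # (Illustrative sketch; the paper's exact procedure may differ.)
    rng = np.random.default_rng(0)
    a, b = M.shape
    W = rng.random((a, rank)) + eps
    H = rng.random((rank, b)) + eps
    for _ in range(n_iters):
        # Multiplicative updates preserve non-negativity and
        # monotonically decrease ||M - W @ H||_F.
        H *= (W.T @ M) / (W.T @ W @ H + eps)
        W *= (M @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy merged factor over a 4-state and a 3-state variable
# (hypothetical numbers, for illustration only).
M = np.array([[4.0, 2.0, 0.5],
              [2.0, 4.0, 1.0],
              [0.5, 1.0, 4.0],
              [1.0, 0.5, 2.0]])
W, H = nmf_split(M, rank=2)
err = np.linalg.norm(M - W @ H) / np.linalg.norm(M)
```

Here `rank` plays the role of the latent variable's number of states: a larger rank reproduces the merged factor more exactly but yields larger split factors, so it trades approximation quality against model size.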