Iterative proportional fitting (IPF), combined with EM, is commonly used for likelihood maximization in undirected graphical models. In this paper, we present two iterative algorithms that generalize IPF. The first maximizes the likelihood in discrete chain factor graphs, which we define as a broad class of discrete-variable models that includes undirected graphical models and Bayesian networks, as well as chain graphs and sigmoid belief networks. The second maximizes the conditional likelihood in standard undirected models and Bayesian networks. In both algorithms, the iteration steps are expressed in closed form. Numerical simulations show that the algorithms are competitive with state-of-the-art methods.
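To ground the abstract, here is a minimal sketch of classical IPF on a two-variable contingency table: the joint is alternately rescaled so its row and column marginals match given targets. This illustrates only the standard algorithm the paper generalizes, not the paper's chain-factor-graph or conditional-likelihood variants; the function name and targets are illustrative.

```python
def ipf(table, row_targets, col_targets, iters=100):
    """Classical IPF: rescale a joint table (list of lists) so its
    row and column sums match the given target marginals."""
    for _ in range(iters):
        # Scale each row to match its target row marginal.
        for i, rt in enumerate(row_targets):
            s = sum(table[i])
            if s > 0:
                table[i] = [x * rt / s for x in table[i]]
        # Scale each column to match its target column marginal.
        for j, ct in enumerate(col_targets):
            s = sum(row[j] for row in table)
            if s > 0:
                for row in table:
                    row[j] = row[j] * ct / s
    return table

# Starting from a uniform joint, fit marginals (0.6, 0.4) and (0.7, 0.3);
# each scaling step is in closed form, as in the algorithms above.
fitted = ipf([[0.25, 0.25], [0.25, 0.25]], [0.6, 0.4], [0.7, 0.3])
```

For independent target marginals the fixed point is simply their outer product; the value of IPF is that it also converges for overlapping marginal constraints on larger tables, where no closed-form joint exists.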