We study two-layer belief networks of binary random variables in which the conditional probabilities Pr[child | parents] depend monotonically on weighted sums of the parents. In large networks where exact probabilistic inference is intractable, we show how to compute upper and lower bounds on many probabilities of interest. In particular, using methods from large deviation theory, we derive rigorous bounds on marginal probabilities such as Pr[children] and prove rates of convergence for the accuracy of our bounds as a function of network size. Our results apply to networks with generic transfer function parameterizations of the conditional probability tables, such as sigmoid and noisy-OR. They also explicitly illustrate the types of averaging behavior that can simplify the problem of inference in large networks.
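The bounding idea described in the abstract can be illustrated with a minimal sketch, not the paper's actual construction: for a single child whose probability is a sigmoid of a weighted sum of independent Bernoulli parents, a Hoeffding tail bound on the weighted sum, combined with the monotonicity and boundedness of the transfer function, yields rigorous upper and lower bounds on the marginal Pr[child = 1]. The helper names `marginal_bounds` and `exact_marginal` are invented for this illustration.

```python
import math
from itertools import product

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def marginal_bounds(weights, priors, eps):
    """Bound Pr[child = 1] for a two-layer network with independent
    Bernoulli parents and a sigmoid transfer function (illustrative
    Hoeffding-style sketch, not the paper's exact bounds)."""
    mu = sum(w * p for w, p in zip(weights, priors))       # mean weighted sum
    denom = sum(w * w for w in weights)
    # Two-sided Hoeffding tail mass: Pr(|S - mu| >= eps) <= delta.
    delta = 2.0 * math.exp(-2.0 * eps * eps / denom)
    # sigmoid is monotone and bounded in [0, 1], so:
    #   E[sigmoid(S)] <= sigmoid(mu + eps) + Pr(S > mu + eps)
    #   E[sigmoid(S)] >= sigmoid(mu - eps) * Pr(S >= mu - eps)
    upper = min(1.0, sigmoid(mu + eps) + delta)
    lower = max(0.0, sigmoid(mu - eps) * (1.0 - delta))
    return lower, upper

def exact_marginal(weights, priors):
    """Brute-force Pr[child = 1] by summing over all 2^n parent states
    (feasible only for small n; the bounds above avoid this sum)."""
    total = 0.0
    for bits in product([0, 1], repeat=len(weights)):
        prob = 1.0
        for b, p in zip(bits, priors):
            prob *= p if b else (1.0 - p)
        s = sum(w * b for w, b in zip(weights, bits))
        total += prob * sigmoid(s)
    return total
```

As the network grows with individual weights shrinking (so that the weighted sum concentrates around its mean), the tail mass `delta` vanishes and the interval `[lower, upper]` tightens, which is the averaging behavior the abstract refers to.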