In this paper we propose a framework, called mixtures of truncated basis functions (MoTBFs), for representing general hybrid Bayesian networks. The proposed framework generalizes both the mixtures of truncated exponentials (MTEs) framework and the mixtures of polynomials (MoPs) framework. As with MTEs and MoPs, MoTBFs are defined so that the potentials are closed under combination and marginalization, which ensures that inference in MoTBF networks can be performed efficiently using the Shafer-Shenoy architecture. Based on a generalized Fourier series approximation, we devise a method for efficiently approximating an arbitrary density function within the MoTBF framework. This translation method is more flexible than existing MTE- or MoP-based methods, and it supports an online/anytime tradeoff between the accuracy and the complexity of the approximation. Experimental results show that the approximations obtained are either comparable to or significantly better than those obtained using existing methods.
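To make the generalized Fourier series idea concrete, the sketch below (not the authors' implementation) projects a target density onto a finite orthogonal basis by least squares: the coefficient of each basis function is its inner product with the density, divided by the squared norm of the basis function. It uses Legendre polynomials on a truncated interval, which is a polynomial instance of the MoTBF family; the interval [-3, 3], the truncation level K = 8, and the standard normal target are illustrative choices, and the sketch omits the nonnegativity and normalization constraints a proper density approximation would need.

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Illustrative setup: approximate a standard normal density on [A, B]
# by a truncated generalized Fourier expansion in Legendre polynomials.
A, B = -3.0, 3.0   # truncation interval (arbitrary choice)
K = 8              # number of basis terms: the accuracy/complexity knob

def rescale(x):
    """Map [A, B] onto [-1, 1], where Legendre polynomials are orthogonal."""
    return 2.0 * (x - A) / (B - A) - 1.0

def legendre_k(k):
    """Return the k-th Legendre basis polynomial as a callable."""
    return np.polynomial.legendre.Legendre.basis(k)

# Generalized Fourier coefficients: c_k = <f, P_k> / <P_k, P_k>,
# with the inner products computed by numerical quadrature over [A, B].
coeffs = []
for k in range(K):
    P_k = legendre_k(k)
    num, _ = integrate.quad(lambda x: norm.pdf(x) * P_k(rescale(x)), A, B)
    # ||P_k||^2 on [-1, 1] is 2/(2k+1); rescaling to [A, B] scales it by (B-A)/2.
    den = (2.0 / (2 * k + 1)) * (B - A) / 2.0
    coeffs.append(num / den)

def motbf_approx(x):
    """Evaluate the truncated expansion sum_k c_k * P_k(rescale(x))."""
    return sum(c * legendre_k(k)(rescale(x)) for k, c in enumerate(coeffs))

for x in np.linspace(A, B, 7):
    print(f"x={x:+.2f}  target={norm.pdf(x):.4f}  approx={motbf_approx(x):.4f}")
```

Because the basis is orthogonal, adding a (K+1)-th term refines the least-squares fit without recomputing the first K coefficients, which is the property underlying the online/anytime accuracy-complexity tradeoff described above.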