Mixtures of truncated exponentials (MTE) potentials are an alternative to discretization for solving hybrid Bayesian networks. Any probability density function (PDF) can be approximated with an MTE potential, which can always be marginalized in closed form. This allows propagation to be done exactly using the Shenoy-Shafer architecture for computing marginals, with no restrictions on the construction of a join tree. This paper presents MTE potentials that approximate an arbitrary normal PDF with any mean and a positive variance. The properties of these MTE potentials are presented, along with examples that demonstrate their use in solving hybrid Bayesian networks. Assuming that the joint density exists, MTE potentials can be used for inference in hybrid Bayesian networks that do not fit the restrictive assumptions of the conditional linear Gaussian (CLG) model, such as networks containing discrete nodes with continuous parents.