We discuss two issues in using mixtures of polynomials (MOPs) for inference in hybrid Bayesian networks. MOPs were proposed by Shenoy and West for mitigating the problem of integration in inference in hybrid Bayesian networks. First, in defining MOPs for multi-dimensional functions, one requirement is that the pieces on which the polynomials are defined are hypercubes. In this paper, we discuss relaxing this condition so that each piece is defined on a region called a hyper-rhombus. With this relaxation, MOPs are closed under the transformations required for multi-dimensional linear deterministic conditionals, such as Z = X + Y. The relaxation also allows us to construct MOP approximations of the probability density functions (PDFs) of multi-dimensional conditional linear Gaussian distributions from a MOP approximation of the PDF of the univariate standard normal distribution. Second, Shenoy and West suggest using the Taylor series expansion of differentiable functions to find MOP approximations of PDFs. In this paper, we describe a new method for finding MOP approximations based on Lagrange interpolating polynomials (LIPs) with Chebyshev points, and show how it can be used to find efficient MOP approximations of PDFs. We illustrate our methods using conditional linear Gaussian PDFs in one, two, and three dimensions, and conditional log-normal PDFs in one and two dimensions. We compare the efficiency of the hyper-rhombus condition with that of the hypercube condition, and the LIP method with the Taylor series method.
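To illustrate the core idea behind the LIP method (this is our own minimal sketch, not the paper's implementation; the function names and the choice of 15 points on [-3, 3] are assumptions for the example), the snippet below interpolates the standard normal PDF at Chebyshev points of the first kind and checks the resulting polynomial against the true PDF on a fine grid:

```python
import math

def chebyshev_points(n, a, b):
    # Chebyshev points of the first kind, mapped from [-1, 1] to [a, b]
    return [(a + b) / 2 + (b - a) / 2 * math.cos((2 * k - 1) * math.pi / (2 * n))
            for k in range(1, n + 1)]

def lagrange_interpolant(xs, ys):
    # Returns a callable evaluating the Lagrange interpolating polynomial
    # through the points (xs[i], ys[i]); naive O(n^2) evaluation per call.
    def p(x):
        total = 0.0
        for i, xi in enumerate(xs):
            term = ys[i]
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return p

def std_normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Interpolate the N(0, 1) PDF on [-3, 3] with 15 Chebyshev points
xs = chebyshev_points(15, -3.0, 3.0)
p = lagrange_interpolant(xs, [std_normal_pdf(x) for x in xs])

# Maximum absolute error on a fine grid over the interpolation interval
err = max(abs(p(t / 100) - std_normal_pdf(t / 100)) for t in range(-300, 301))
print(f"max abs error with 15 Chebyshev points: {err:.2e}")
```

Chebyshev points cluster near the interval endpoints, which avoids the Runge oscillations that equally spaced interpolation nodes would produce; for a smooth density like the Gaussian, the interpolation error decays rapidly as the number of points grows.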