Unscented message passing for arbitrary continuous variables in Bayesian networks

  • Authors:
  • Wei Sun; Kuo-Chu Chang

  • Affiliations:
  • Both authors: Department of Systems Engineering and Operations Research, George Mason University, Fairfax, VA

  • Venue:
  • AAAI'07: Proceedings of the 22nd National Conference on Artificial Intelligence - Volume 2
  • Year:
  • 2007

Abstract

Since Bayesian networks (BNs) were introduced to the field of artificial intelligence in the 1980s, a number of inference algorithms have been developed for probabilistic reasoning. However, when continuous variables are present in a Bayesian network, their dependence relationships may be nonlinear and their probability distributions may be arbitrary. So far, no efficient inference algorithm can handle this case except Monte Carlo simulation methods such as Likelihood Weighting, and with unlikely evidence, simulation methods can be very slow to converge. In this paper, we propose an efficient approximate inference algorithm called Unscented Message Passing (UMP-BN) for Bayesian networks with arbitrary continuous variables. UMP-BN combines the unscented transformation, a deterministic sampling method, with Pearl's message passing algorithm to estimate the first two moments of the posterior distributions. We test the algorithm on several networks, including ones with nonlinear and/or non-Gaussian variables. The numerical experiments show that UMP-BN converges quickly and produces promising results.
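
For readers unfamiliar with the deterministic sampling step the abstract refers to, the sketch below illustrates a standard unscented transformation in Python/NumPy. It is a minimal illustration under common parameter choices (alpha, beta, kappa), not the paper's implementation; the function and variable names are ours.

```python
# A minimal sketch of the unscented transformation: propagate a Gaussian
# approximation through a nonlinear function using 2n+1 deterministically
# chosen sigma points, then recover the mean and covariance of the output.
# Parameter defaults (alpha, beta, kappa) are common choices, not taken
# from the paper; names here are illustrative only.
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    n = mean.size
    lam = alpha ** 2 * (n + kappa) - n

    # Sigma points: the mean plus/minus the columns of a scaled matrix
    # square root of the covariance (a Cholesky factor is used here).
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    sigma_pts = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])

    # Standard weights for the mean and covariance estimates.
    w_m = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)

    # Push each sigma point through the (possibly nonlinear) function.
    y_pts = np.array([f(x) for x in sigma_pts])

    # First two moments of the transformed variable.
    y_mean = w_m @ y_pts
    diff = y_pts - y_mean
    y_cov = (w_c[:, None] * diff).T @ diff
    return y_mean, y_cov

# Example: y = x^2 with x ~ N(1, 0.5); the true mean E[y] = 1 + 0.5 = 1.5
# is recovered exactly because f is quadratic.
m, P = np.array([1.0]), np.array([[0.5]])
y_mean, y_cov = unscented_transform(m, P, lambda x: x ** 2)
print(y_mean, y_cov)
```

Roughly speaking, UMP-BN uses this kind of deterministic moment propagation inside Pearl's message passing, so that messages between continuous nodes can be summarized by their first two moments rather than computed by analytic integration or stochastic sampling.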