Tracking and data association
Readings in nonmonotonic reasoning
Statistical methods for speech recognition
An efficient algorithm for finding the M most probable configurations in probabilistic expert systems
Statistics and Computing
On sequential Monte Carlo sampling methods for Bayesian filtering
Statistics and Computing
Bayesian Fault Detection and Diagnosis in Dynamic Systems
Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence
Variational Learning for Switching State-Space Models
Neural Computation
Tractable inference for complex stochastic processes
UAI'98 Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence
Inference in hybrid Bayesian networks using mixtures of polynomials
International Journal of Approximate Reasoning
Methodological Review: A review of causal inference for biomedical informatics
Journal of Biomedical Informatics
Dynamic Bayesian networks for visual surveillance with distributed cameras
EuroSSC'06 Proceedings of the First European conference on Smart Sensing and Context
Two issues in using mixtures of polynomials for inference in hybrid Bayesian networks
International Journal of Approximate Reasoning
A logic for causal inference in time series with discrete and continuous variables
IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Two
Inference on networks of mixtures for robust robot mapping
International Journal of Robotics Research
An important subclass of hybrid Bayesian networks is the class that represents Conditional Linear Gaussian (CLG) distributions -- distributions with a multivariate Gaussian component for each instantiation of the discrete variables. In this paper we explore the problem of inference in CLGs and provide complexity results for an important class of CLGs, which includes Switching Kalman Filters. In particular, we prove that even if the CLG is restricted to an extremely simple polytree structure, the inference task is NP-hard. Furthermore, we show that, unless P = NP, even approximate inference on these simple networks is intractable. Given the often prohibitive computational cost of even approximate inference, we must exploit special domain properties that may enable efficient inference. We concentrate on the fault-diagnosis domain and explore several approximate inference algorithms, each of which tries to find a small subset of Gaussians that closely approximates the full mixture distribution. We consider two Monte Carlo approaches and a novel approach that enumerates mixture components in order of prior probability. We compare these methods on a variety of problems and show that our novel algorithm is very promising for large hybrid diagnosis problems.
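The enumeration strategy mentioned in the abstract -- visiting mixture components in order of prior probability -- can be sketched as a best-first search over the joint instantiations of the discrete variables. The sketch below is an illustrative reconstruction, not the paper's actual algorithm: it assumes the discrete variables are independent with given marginal priors, whereas a real CLG would require computing instantiation priors from the network structure.

```python
import heapq

def enumerate_by_prior(priors, k):
    """Lazily enumerate joint assignments of independent discrete variables
    in non-increasing order of prior probability (best-first search).

    priors: list (one entry per variable) of lists of (value, probability).
    Returns the k most probable joint assignments with their probabilities;
    in a CLG, each such assignment would index one Gaussian component.
    """
    # Sort each variable's outcomes by probability, descending, so that the
    # all-zeros index tuple is the single most probable joint assignment.
    sorted_priors = [sorted(p, key=lambda vp: -vp[1]) for p in priors]

    def prob(idx):
        out = 1.0
        for var, i in zip(sorted_priors, idx):
            out *= var[i][1]
        return out

    start = (0,) * len(priors)
    heap = [(-prob(start), start)]   # max-heap via negated probabilities
    seen = {start}
    results = []
    while heap and len(results) < k:
        neg_p, idx = heapq.heappop(heap)
        assignment = tuple(sorted_priors[v][i][0] for v, i in enumerate(idx))
        results.append((assignment, -neg_p))
        # Successors: demote one variable at a time to its next-likeliest value.
        for v in range(len(idx)):
            if idx[v] + 1 < len(sorted_priors[v]):
                nxt = idx[:v] + (idx[v] + 1,) + idx[v + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(heap, (-prob(nxt), nxt))
    return results
```

For example, with two binary variables whose priors are (0.7, 0.3) and (0.6, 0.4), the first three components returned carry prior mass 0.42, 0.28, and 0.18 -- already 88% of the full mixture, which is the intuition behind approximating the mixture with a small subset of its Gaussians.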