We formulate the problem of inference in nonlinear dynamical systems in the framework of expectation propagation and propose two novel algorithms. The first is based on the Laplace approximation and allows for iterated forward and backward passes. The second is based on repeated application of the unscented transform and leads to an unscented Kalman smoother for which the dynamics need not be inverted explicitly. In experiments with a one-dimensional nonlinear dynamical system, we show that for relatively low observation noise levels the Laplace algorithm yields the best estimates of the state means. The unscented algorithm, however, is more robust to high observation noise and consistently outperforms the conventional inference methods against which it was benchmarked.
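The unscented transform at the core of the second algorithm propagates a Gaussian through a nonlinearity by pushing a small set of deterministically chosen sigma points through it and re-estimating the mean and covariance from the results. The sketch below illustrates that step in isolation; it is not the authors' smoother, and the `kappa` parameter and weights follow the standard Julier–Uhlmann formulation as an assumption.

```python
import numpy as np

def unscented_transform(f, mean, cov, kappa=1.0):
    """Propagate a Gaussian N(mean, cov) through a nonlinear function f
    using the classic unscented transform (illustrative sketch)."""
    n = mean.size
    # Sigma points: the mean plus/minus the columns of a matrix square
    # root of (n + kappa) * cov, giving 2n + 1 points in total.
    L = np.linalg.cholesky((n + kappa) * cov)
    sigma = np.vstack([mean, mean + L.T, mean - L.T])  # shape (2n+1, n)
    # Standard weights; the centre point gets kappa / (n + kappa).
    w = np.full(2 * n + 1, 0.5 / (n + kappa))
    w[0] = kappa / (n + kappa)
    # Push each sigma point through the nonlinearity.
    y = np.array([f(s) for s in sigma])
    # Recover the transformed mean and covariance from weighted moments.
    y_mean = w @ y
    d = y - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov
```

Because only function evaluations of `f` are needed, the dynamics never have to be inverted or differentiated, which is what makes the resulting smoother attractive for backward passes through nonlinear dynamics.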