Modeling Dst with Recurrent EM Neural Networks
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
In this paper, we derive an EM algorithm for nonlinear state space models. We use it to jointly estimate the neural network weights, the model uncertainty, and the noise in the data. In the E-step we apply a forward-backward Rauch-Tung-Striebel smoother to compute the network weights. In the M-step, we derive expressions for updating the model uncertainty and the measurement noise. We find the method to be powerful, simple, and stable.
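The abstract's scheme can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's implementation): the weights follow a random walk with covariance Q (the "model uncertainty"), the measurement is y_t = h(w_t, x_t) + noise with variance R, the E-step runs an extended Kalman filter forward and a Rauch-Tung-Striebel smoother backward, and the M-step re-estimates Q and R from the smoothed moments. The function names, the toy linear "network" h(w, x) = w·x, and the simplified Q update (the lagged cross-covariance term is dropped) are all assumptions made for brevity.

```python
import numpy as np

def em_state_space(xs, ys, h, h_jac, w_dim, n_iter=10, q0=1e-2, r0=1e-1):
    """Hypothetical EM sketch: weights w_t are the latent state (random
    walk, covariance Q); scalar measurements y_t = h(w_t, x_t) + noise
    with variance R. E-step: EKF forward + RTS smoother backward.
    M-step: moment-based updates of Q and R."""
    T = len(ys)
    Q = q0 * np.eye(w_dim)
    R = r0
    for _ in range(n_iter):
        # ---- E-step: forward extended Kalman filter ----
        m, P = np.zeros(w_dim), np.eye(w_dim)
        ms_f, Ps_f, ms_p, Ps_p = [], [], [], []
        for t in range(T):
            mp, Pp = m, P + Q                    # random-walk prediction
            H = h_jac(mp, xs[t])                 # linearized measurement
            S = H @ Pp @ H + R                   # innovation variance
            K = Pp @ H / S                       # Kalman gain
            m = mp + K * (ys[t] - h(mp, xs[t]))
            P = Pp - np.outer(K, H @ Pp)
            ms_p.append(mp); Ps_p.append(Pp)
            ms_f.append(m);  Ps_f.append(P)
        # ---- E-step: backward Rauch-Tung-Striebel smoother ----
        ms_s, Ps_s = [None] * T, [None] * T
        ms_s[-1], Ps_s[-1] = ms_f[-1], Ps_f[-1]
        for t in range(T - 2, -1, -1):
            G = Ps_f[t] @ np.linalg.inv(Ps_p[t + 1])
            ms_s[t] = ms_f[t] + G @ (ms_s[t + 1] - ms_p[t + 1])
            Ps_s[t] = Ps_f[t] + G @ (Ps_s[t + 1] - Ps_p[t + 1]) @ G.T
        # ---- M-step: re-estimate Q and R from smoothed moments ----
        dq, dr = np.zeros((w_dim, w_dim)), 0.0
        for t in range(T):
            H = h_jac(ms_s[t], xs[t])
            e = ys[t] - h(ms_s[t], xs[t])
            dr += e * e + H @ Ps_s[t] @ H
            if t > 0:
                d = ms_s[t] - ms_s[t - 1]
                # Simplified: lagged cross-covariance term omitted,
                # which biases Q upward.
                dq += np.outer(d, d) + Ps_s[t] + Ps_s[t - 1]
        Q = dq / (T - 1)
        R = dr / T
    return np.array(ms_s), Q, R

# Toy usage (assumed setup): a linear "network" h(w, x) = w @ x,
# so the smoothed weights should recover the true regression weights.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5])
xs = rng.normal(size=(200, 2))
ys = xs @ w_true + 0.1 * rng.normal(size=200)
w_hat, Q, R = em_state_space(xs, ys, lambda w, x: w @ x,
                             lambda w, x: x, w_dim=2)
```

In this toy linear case the EKF linearization is exact, so the final smoothed weight estimate converges to the least-squares solution while R approaches the true measurement-noise variance.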