Learning nonlinear dynamical systems using an EM algorithm
Proceedings of the 1998 conference on Advances in neural information processing systems II
Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability
Fast particle smoothing: if I had a million particles
ICML '06 Proceedings of the 23rd international conference on Machine learning
Pattern Recognition and Machine Learning (Information Science and Statistics)
Training Recurrent Networks by Evolino
Neural Computation
Recurrent neural networks serve as black-box models for nonlinear dynamical system identification and time series prediction. Training of recurrent networks typically minimizes the quadratic difference between the network output and an observed time series. This implicitly assumes that the dynamics of the underlying system are deterministic, which is not a realistic assumption in many cases. In contrast, state-space models allow for noise in both the internal state transitions and the mapping from internal states to observations. Here, we consider recurrent networks as nonlinear state-space models and suggest a training algorithm based on Expectation-Maximization. A nonlinear transfer function for the hidden neurons leads to an intractable inference problem. We investigate the use of a Particle Smoother to approximate the E-step and simultaneously estimate the expectations required in the M-step. The method is demonstrated on a synthetic data set and on a time series prediction task arising in radiation therapy, where the goal is to predict the motion of a lung tumor during respiration.
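The approximate inference step the abstract refers to can be illustrated with a bootstrap particle filter on a toy one-neuron state-space model. This is a minimal sketch under assumed parameters, not the paper's method: the model (a scalar tanh transition with Gaussian process and observation noise) and all constants below are illustrative, and the paper additionally uses a smoother, which reweights the filtered particles backward in time before forming the M-step expectations.

```python
import math
import random

random.seed(0)

# Hypothetical scalar "recurrent" state-space model (illustrative, not from
# the paper):  x_t = tanh(A * x_{t-1}) + N(0, Q^2),  y_t = x_t + N(0, R^2)
A, Q, R = 1.2, 0.1, 0.2  # transition gain, process / observation noise stds

def simulate(T):
    """Draw a latent trajectory and noisy observations from the toy model."""
    xs = [0.0]
    for _ in range(T - 1):
        xs.append(math.tanh(A * xs[-1]) + random.gauss(0, Q))
    ys = [x + random.gauss(0, R) for x in xs]
    return xs, ys

def particle_filter(ys, n=1000):
    """Bootstrap particle filter: propagate, weight, resample at each step.

    Returns the filtered posterior mean E[x_t | y_1..y_t] for every t.
    """
    particles = [random.gauss(0, 1) for _ in range(n)]
    means = []
    for t, y in enumerate(ys):
        if t > 0:  # propagate particles through the nonlinear transition
            particles = [math.tanh(A * p) + random.gauss(0, Q)
                         for p in particles]
        # importance weights from the Gaussian observation likelihood,
        # computed in log space for numerical stability
        logw = [-0.5 * ((y - p) / R) ** 2 for p in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * p for wi, p in zip(w, particles)))
        particles = random.choices(particles, weights=w, k=n)  # resample
    return means

xs, ys = simulate(100)
est = particle_filter(ys)
rmse = math.sqrt(sum((e - x) ** 2 for e, x in zip(est, xs)) / len(xs))
print(f"filter RMSE: {rmse:.3f}")
```

Because the filtered estimate pools many noisy observations through the transition model, its error should fall below the raw observation noise; an EM loop would re-estimate parameters such as `A`, `Q`, and `R` from the smoothed particle statistics in the M-step.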