Letters: Efficient online recurrent connectionist learning with the ensemble Kalman filter

  • Authors:
  • Derrick T. Mirikitani; Nikolay Nikolaev

  • Affiliations:
  • Department of Computing, Goldsmiths College, University of London, New Cross, London SE14 6NW, UK (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2010

Abstract

One of the main drawbacks of online learning for recurrent neural networks (RNNs) is the high computational cost of training. Much effort has been spent on reducing the computational complexity of online learning algorithms, usually focusing on the real-time recurrent learning (RTRL) algorithm. Significant reductions in the complexity of RTRL have been achieved, but at the cost of degraded model performance. We take a different approach to complexity reduction in online learning of RNNs through a sequential Bayesian filtering framework and propose the ensemble Kalman filter (EnKF) for derivative-free parameter estimation. The EnKF provides an online training solution that, under certain assumptions, can reduce the computational complexity by two orders of magnitude relative to the original RTRL algorithm without sacrificing the modeling potential of the network. Forecasting experiments on observed data from nonlinear systems show that the EnKF-trained RNN outperforms RNNs trained with other algorithms in terms of real computational time and also produces better forecasts.
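To make the idea concrete, the following is a minimal sketch of one derivative-free EnKF update step for online parameter estimation, in the spirit of the abstract. It is not the authors' implementation: the function and parameter names (`enkf_step`, `h`, `q`, `r`, the scalar-observation setup) are illustrative assumptions, and `h` stands in for a network forward pass, which is all the filter needs (no Jacobians, unlike RTRL).

```python
import numpy as np

def enkf_step(ensemble, y_obs, h, q=1e-4, r=1e-2, rng=None):
    """One derivative-free EnKF update (illustrative sketch).

    ensemble : (n_ens, n_params) array of sampled parameter vectors
    y_obs    : observed scalar target at the current time step
    h        : maps a parameter vector to a predicted scalar
               (e.g. an RNN forward pass); no derivatives required
    q, r     : assumed process- and observation-noise variances
    """
    rng = np.random.default_rng() if rng is None else rng
    n_ens, _ = ensemble.shape

    # Predict: random-walk model for the parameters (process noise q)
    ensemble = ensemble + rng.normal(0.0, np.sqrt(q), ensemble.shape)

    # Propagate each ensemble member through the model
    preds = np.array([h(w) for w in ensemble])          # (n_ens,)

    # Ensemble anomalies (deviations from the ensemble mean)
    W = ensemble - ensemble.mean(axis=0)                # (n_ens, n_params)
    d = preds - preds.mean()                            # (n_ens,)

    # Sample cross-covariance and innovation variance
    P_wy = W.T @ d / (n_ens - 1)                        # (n_params,)
    P_yy = d @ d / (n_ens - 1) + r                      # scalar

    # Kalman gain and perturbed-observation update
    K = P_wy / P_yy                                     # (n_params,)
    y_pert = y_obs + rng.normal(0.0, np.sqrt(r), n_ens)
    return ensemble + np.outer(y_pert - preds, K)
```

As a toy usage example, running this filter with `h(w) = w[0] * x` on data generated by `y = 2x` drives the ensemble mean toward the true slope of 2, illustrating how the ensemble statistics replace the explicit sensitivity computations of RTRL.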