Improving recurrent neural network performance using transfer entropy

  • Authors:
  • Oliver Obst, Joschka Boedecker, Minoru Asada

  • Affiliations:
  • Oliver Obst: CSIRO ICT Centre, Adaptive Systems, Epping, NSW, Australia, and School of Information Technologies, The University of Sydney, NSW, Australia
  • Joschka Boedecker and Minoru Asada: Department of Adaptive Machine Systems, Osaka University, Osaka, Japan, and JST ERATO Asada Synergistic Intelligence Project, Osaka, Japan

  • Venue:
  • ICONIP'10: Proceedings of the 17th International Conference on Neural Information Processing: Models and Applications, Part II
  • Year:
  • 2010

Abstract

Reservoir computing approaches have been successfully applied to a variety of tasks. An inherent problem of these approaches, however, is the variation in performance caused by the fixed random initialisation of the reservoir. Self-organised approaches such as intrinsic plasticity have been applied to improve reservoir quality, but they do not take the task of the system into account. We present an approach to improving the hidden layer of recurrent neural networks that is guided by the learning goal of the system. Our reservoir adaptation optimises the information transfer at each individual unit, dependent on properties of the information transfer between the input and output of the system. Using synthetic data, we show that this reservoir adaptation improves the performance of both offline echo state learning and online learning with recursive least squares.
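
The abstract only outlines the adaptation scheme, so a rough illustration may help. Below is a minimal, hypothetical Python sketch of the general idea: estimate the transfer entropy between the task's input and desired output, then nudge a per-unit gain in an echo state reservoir so that the information transfer into each unit moves toward that level. The histogram-based TE estimator, the `gain` parameter, and the update rule are all illustrative assumptions made for this sketch, not the authors' exact formulation.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Histogram estimate of TE(X -> Y) with history length 1:
    TE = sum p(y', y, x) * log2[ p(y'|y,x) / p(y'|y) ]."""
    # Discretise both series into equal-width bins.
    xd = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    # Joint distribution p(y', y, x) and its marginals.
    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(y_next, y_now, x_now):
        joint[a, b, c] += 1
    joint /= joint.sum()
    p_yy = joint.sum(axis=2)       # p(y', y)
    p_yx = joint.sum(axis=0)       # p(y, x)
    p_y = joint.sum(axis=(0, 2))   # p(y)

    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p = joint[a, b, c]
                if p > 0 and p_yy[a, b] > 0 and p_yx[b, c] > 0 and p_y[b] > 0:
                    te += p * np.log2((p / p_yx[b, c]) / (p_yy[a, b] / p_y[b]))
    return te

# Hypothetical reservoir adaptation loop. The per-unit gain and the
# proportional update are illustrative stand-ins for the paper's rule.
rng = np.random.default_rng(0)
n_units, T = 50, 2000
W = rng.normal(0, 1, (n_units, n_units))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius
w_in = rng.normal(0, 1, n_units)
u = rng.uniform(-1, 1, T)                        # synthetic input
d = np.roll(u, 3)                                # delayed-copy target task

te_target = transfer_entropy(u, d)               # task-level transfer

gain = np.ones(n_units)
for epoch in range(10):
    x = np.zeros(n_units)
    states = np.zeros((T, n_units))
    for t in range(T):
        x = np.tanh(gain * (W @ x + w_in * u[t]))
        states[t] = x
    # Move each unit's input-to-state TE toward the task-level TE.
    for i in range(n_units):
        te_i = transfer_entropy(u, states[:, i])
        gain[i] += 0.1 * (te_target - te_i)
    gain = np.clip(gain, 0.1, 2.0)               # keep the dynamics stable
```

After adaptation, the reservoir states would be fed to the usual echo state readout (ridge regression offline, or recursive least squares online); the sketch uses a coarse histogram estimator purely to stay self-contained, whereas any serious use would require a properly validated TE estimator.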