A spiking neural sparse distributed memory implementation for learning and predicting temporal sequences

  • Authors:
  • J. Bose; S. B. Furber; J. L. Shapiro

  • Affiliations:
  • School of Computer Science, University of Manchester, Manchester, UK (all authors)

  • Venue:
  • ICANN'05: Proceedings of the 15th International Conference on Artificial Neural Networks: Biological Inspirations - Volume Part I
  • Year:
  • 2005


Abstract

In this paper we present a neural sequence machine that learns temporal sequences of discrete symbols and performs better than machines that use Elman's context layer, time-delay networks, or shift-register-like context memories. The machine can perform sequence detection, prediction, and learning of new sequences. The network model is an associative memory with a separate store for the sequence context of a pattern. Learning is one-shot, and the model supports both off-line and on-line learning. The machine is based on a sparse distributed memory, which is used to store associations between the current context and the input symbol. We have carried out numerical tests to verify the machine's properties, and we have also shown that the memory can be implemented using spiking neurons.
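
To illustrate the kind of associative store the abstract describes, below is a minimal sketch of a Kanerva-style sparse distributed memory that associates a binary context vector with the next symbol of a sequence. This is not the authors' implementation and does not model spiking neurons; all class names, parameters (number of hard locations, dimensionality, activation radius), and the demo at the end are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the authors' implementation) of a
# Kanerva-style sparse distributed memory used as a sequence memory:
# the address is the current context vector, the stored data is the
# code for the next symbol. Parameters are arbitrary example values.
import numpy as np

class SparseDistributedMemory:
    def __init__(self, n_locations=1000, dim=256, radius=115, seed=0):
        rng = np.random.default_rng(seed)
        # Hard locations: fixed random binary addresses.
        self.addresses = rng.integers(0, 2, size=(n_locations, dim))
        # Counters accumulate the data written at activated locations.
        self.counters = np.zeros((n_locations, dim), dtype=int)
        self.radius = radius

    def _activated(self, address):
        # A location is active if its Hamming distance to the address
        # is within the access radius.
        dists = np.count_nonzero(self.addresses != address, axis=1)
        return dists <= self.radius

    def write(self, address, data):
        # One-shot write: bipolar (+1/-1) data is added to the counters
        # of every activated location.
        act = self._activated(address)
        self.counters[act] += np.where(data > 0, 1, -1)

    def read(self, address):
        # Recall: sum the counters of activated locations and threshold
        # each bit at zero to recover a binary pattern.
        act = self._activated(address)
        return (self.counters[act].sum(axis=0) > 0).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sdm = SparseDistributedMemory()
    context = rng.integers(0, 2, size=256)      # current sequence context
    next_symbol = rng.integers(0, 2, size=256)  # code of the next symbol
    sdm.write(context, next_symbol)             # store the association
    recalled = sdm.read(context)                # predict from the context
    print("bits correct:", np.count_nonzero(recalled == next_symbol), "/ 256")
```

In this sketch, presenting the same (or a noisy version of the) context address recalls the stored next-symbol code, which is the association pattern the abstract attributes to the sparse distributed memory; the paper's separate context store and spiking-neuron realisation are not reproduced here.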