Self-organized reservoirs and their hierarchies

  • Authors: Mantas Lukoševičius
  • Affiliations: Jacobs University Bremen, Bremen, Germany
  • Venue: ICANN'12: Proceedings of the 22nd International Conference on Artificial Neural Networks and Machine Learning, Part I
  • Year: 2012

Abstract

We investigate how unsupervised training of recurrent neural networks (RNNs) and their deep hierarchies can benefit a supervised task such as temporal pattern detection. The RNNs are trained fully and quickly by unsupervised algorithms; only the feed-forward readouts are trained in a supervised way. In a rigorous comparison, the unsupervised RNNs are shown to outperform state-of-the-art random reservoir networks. Hierarchies of such RNNs, trained greedily bottom-up in an unsupervised manner, are shown to yield substantial performance improvements over single-layer setups.
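
To make the overall setup concrete, below is a minimal Python sketch of a reservoir RNN whose input weights are adapted by a simple SOM-like unsupervised rule, with a supervised ridge-regression readout as the only trained output layer. This is an illustration of the general reservoir-plus-readout scheme the abstract describes, not the paper's exact algorithm: the toy pattern-detection task, the reservoir size, and the specific unsupervised update rule are all assumptions made for the example.

```python
# Illustrative sketch only: unsupervised (SOM-like) adaptation of reservoir input
# weights + supervised linear readout. Hyperparameters and the toy task are assumed.
import numpy as np

rng = np.random.default_rng(0)

# --- toy temporal pattern detection task (hypothetical data) ---
T = 2000
u = rng.uniform(-1, 1, size=(T, 1))                        # 1-D input signal
y = (np.roll(u[:, 0], 1) * u[:, 0] > 0.25).astype(float)   # detect a simple temporal pattern

N = 100                                                     # reservoir size
W_in = rng.uniform(-1, 1, size=(N, 1))                      # input weights (to be self-organized)
W = rng.normal(0, 1, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))             # scale spectral radius below 1

# --- unsupervised, SOM-like adaptation of the input weights (illustrative rule) ---
lr = 0.05
for t in range(T):
    dists = np.abs(W_in[:, 0] - u[t, 0])                    # distance of each unit's weight to the input
    bmu = np.argmin(dists)                                   # best-matching unit
    neigh = np.exp(-((np.arange(N) - bmu) ** 2) / (2 * 5.0 ** 2))
    W_in[:, 0] += lr * neigh * (u[t, 0] - W_in[:, 0])        # pull weights toward the input

# --- run the reservoir and collect states ---
x = np.zeros(N)
X = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    X[t] = x

# --- supervised ridge-regression readout (the only supervised part) ---
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
y_hat = X @ W_out
print("train MSE:", np.mean((y_hat - y) ** 2))
```

A hierarchy in the spirit of the abstract would stack such reservoirs greedily, feeding the (unsupervised-trained) states of one layer as input to the next, with the supervised readout attached only at the end.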