Improving the state space organization of untrained recurrent networks

  • Authors:
  • Michal Čerňanský, Matej Makula, Ľubica Beňušková

  • Affiliations:
  • Faculty of Informatics and Information Technologies, STU, Bratislava, Slovakia (M. Čerňanský, M. Makula); Department of Computer Science, University of Otago, Dunedin, New Zealand (Ľ. Beňušková)

  • Venue:
  • ICONIP'08: Proceedings of the 15th International Conference on Advances in Neuro-Information Processing, Part I
  • Year:
  • 2008


Abstract

Recurrent neural networks are frequently used in the cognitive science community for modeling linguistic structures. A more or less intensive training process is usually performed, but several works have shown that untrained recurrent networks initialized with small weights can also be used successfully for this type of task. In this work we demonstrate that the state space organization of an untrained recurrent neural network can be significantly improved by choosing appropriate input representations. We support this notion experimentally on several linguistic time series.
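
The abstract's central idea can be illustrated with a minimal NumPy sketch (not the authors' code; the alphabet, dimensions, weight ranges, and the two input encodings below are illustrative assumptions): an untrained recurrent network with small random weights is driven by a symbol sequence under two different input representations, yielding two state trajectories whose organization can then be compared.

```python
import numpy as np

# Minimal sketch: an untrained recurrent network driven by a symbolic
# sequence. All dimensions, the alphabet, and the choice of input
# encodings are hypothetical, not taken from the paper.

rng = np.random.default_rng(0)
alphabet = ["a", "b", "c"]   # hypothetical 3-symbol alphabet
n_hidden = 50                # hypothetical state dimension

# Small recurrent weights give contractive (Markovian) dynamics;
# the network is never trained.
W = rng.uniform(-0.1, 0.1, (n_hidden, n_hidden))

def run(sequence, codes, W_in):
    """Drive the untrained network and collect its state trajectory."""
    x = np.zeros(n_hidden)
    states = []
    for s in sequence:
        x = np.tanh(W @ x + W_in @ codes[s])
        states.append(x.copy())
    return np.array(states)

seq = rng.choice(alphabet, size=1000)

# Input representation 1: one-hot codes.
one_hot = {s: np.eye(len(alphabet))[i] for i, s in enumerate(alphabet)}
W_in1 = rng.uniform(-0.5, 0.5, (n_hidden, len(alphabet)))
states1 = run(seq, one_hot, W_in1)

# Input representation 2: dense random codes of a different dimension.
dense = {s: rng.normal(0.0, 1.0, 5) for s in alphabet}
W_in2 = rng.uniform(-0.5, 0.5, (n_hidden, 5))
states2 = run(seq, dense, W_in2)

# One could now inspect how well each state space clusters by recent
# input history, e.g. via PCA or clustering of states1 vs. states2.
```

Under small-weight initialization, states cluster by the suffix of recently seen symbols, so the choice of input code directly shapes how cleanly those clusters separate; comparing `states1` and `states2` makes that effect visible.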