Learning exponential state-growth languages by hill climbing

  • Authors: W. Tabor
  • Affiliation: Dept. of Psychology, University of Connecticut, Storrs, CT, USA
  • Venue: IEEE Transactions on Neural Networks
  • Year: 2003

Abstract

Training recurrent neural networks on infinite-state languages has been successful for languages in which the minimal number of machine states grows linearly with sentence length, but has fared poorly on exponential state-growth languages. The architecture proposed here learns several exponential state-growth languages nearly perfectly by hill climbing.
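The abstract does not specify the paper's architecture or training procedure in detail, so the following is only an illustrative sketch of the general idea of training a recurrent network by hill climbing: repeatedly perturb the weights at random and keep a candidate only if it lowers next-symbol prediction loss. The tiny Elman-style RNN, the even-length palindrome data (a classic exponential state-growth language), and all names here are hypothetical, not the paper's model.

```python
import copy
import math
import random

rng = random.Random(1)
H = 3  # hidden units; alphabet is {a, b}

def init():
    # Random initial weights: input->hidden, hidden->hidden,
    # hidden->output, and the two bias vectors.
    r = lambda: rng.uniform(-0.5, 0.5)
    return {
        "Wxh": [[r() for _ in range(H)] for _ in range(2)],
        "Whh": [[r() for _ in range(H)] for _ in range(H)],
        "Why": [[r() for _ in range(2)] for _ in range(H)],
        "bh": [r() for _ in range(H)],
        "by": [r() for _ in range(2)],
    }

def step(w, h, x):
    # h' = tanh(x Wxh + h Whh + bh)
    return [math.tanh(sum(x[i] * w["Wxh"][i][j] for i in range(2))
                      + sum(h[k] * w["Whh"][k][j] for k in range(H))
                      + w["bh"][j]) for j in range(H)]

def predict(w, h):
    # Softmax over the two output symbols.
    z = [sum(h[k] * w["Why"][k][o] for k in range(H)) + w["by"][o]
         for o in range(2)]
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def loss(w, strings):
    # Mean cross-entropy of next-symbol prediction.
    total, count = 0.0, 0
    for s in strings:
        h = [0.0] * H
        for t in range(len(s) - 1):
            x = [1.0, 0.0] if s[t] == "a" else [0.0, 1.0]
            h = step(w, h, x)
            p = predict(w, h)
            total -= math.log(p[0 if s[t + 1] == "a" else 1] + 1e-12)
            count += 1
    return total / count

def perturb(w, scale=0.1):
    # Add small Gaussian noise to every weight.
    new = copy.deepcopy(w)
    for val in new.values():
        rows = val if isinstance(val[0], list) else [val]
        for row in rows:
            for j in range(len(row)):
                row[j] += rng.gauss(0, scale)
    return new

def sample(n):
    # Even-length palindromes w + reverse(w): recognizing them
    # requires exponentially many states in string length.
    out = []
    for _ in range(n):
        w = [rng.choice("ab") for _ in range(rng.randint(1, 4))]
        out.append("".join(w) + "".join(reversed(w)))
    return out

data = sample(40)
best = init()
start_loss = loss(best, data)
best_loss = start_loss
for _ in range(300):
    cand = perturb(best)
    cl = loss(cand, data)
    if cl < best_loss:  # hill climbing: accept only improvements
        best, best_loss = cand, cl
```

Because the search only ever accepts improving steps, `best_loss` is non-increasing over the run; on this toy task it typically drops below the initial random-weight loss within a few hundred perturbations.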