Syntactic systematicity in sentence processing with a recurrent self-organizing network

  • Authors:
  • Igor Farkaš; Matthew W. Crocker

  • Affiliations:
  • Department of Applied Informatics, Comenius University, Mlynská dolina, 842 48 Bratislava, Slovak Republic; Department of Computational Linguistics and Phonetics, Saarland University, Saarbrücken 66041, Germany

  • Venue:
  • Neurocomputing

  • Year:
  • 2008

Abstract

As potential candidates for explaining human cognition, connectionist models of sentence processing must demonstrate their ability to behave systematically, generalizing from a small training set. It has recently been shown that simple recurrent networks and, to a greater extent, echo-state networks possess some ability to generalize in artificial language learning tasks. We investigate this capacity for a recently introduced model that consists of two separately trained modules: a recursive self-organizing module that learns temporal context representations and a feedforward two-layer perceptron module for next-word prediction. We show that the performance of this architecture is comparable with that of echo-state networks. Taken together, these results weaken the criticism of connectionist approaches, showing that various general recursive connectionist architectures share the potential to behave systematically.
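The two-module pipeline described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python/NumPy rendition, not the authors' implementation: a RecSOM-style recursive self-organizing map first learns temporal context representations from a toy word stream, and a two-layer perceptron is then trained separately on the frozen map activations to predict the next word. The toy corpus, map size, mixing coefficients, and learning rates are all illustrative assumptions.

```python
# Sketch of the two-module architecture: a recursive self-organizing map
# (RecSOM-like) plus a separately trained two-layer perceptron.
# All hyperparameters below are assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: a stream of word indices over a small vocabulary.
vocab_size = 8
stream = rng.integers(0, vocab_size, size=2000)

def one_hot(i, n=vocab_size):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# --- Module 1: recursive self-organizing map ---
side = 8
n_units = side * side                          # assumed 8x8 map
grid = np.array([[i // side, i % side] for i in range(n_units)], float)
W = rng.normal(0, 0.1, (n_units, vocab_size))  # input weights
C = rng.normal(0, 0.1, (n_units, n_units))     # context weights
alpha, beta = 2.0, 0.7                         # assumed mixing coefficients
lr, sigma = 0.1, 2.0                           # SOM learning rate, radius

y_prev = np.zeros(n_units)
for tok in stream:
    x = one_hot(tok)
    # Distance combines input match and match to the previous activation map.
    d = alpha * ((x - W) ** 2).sum(1) + beta * ((y_prev - C) ** 2).sum(1)
    y = np.exp(-d)                             # map activation
    k = d.argmin()                             # best-matching unit
    nb = np.exp(-((grid - grid[k]) ** 2).sum(1) / (2 * sigma ** 2))
    W += lr * nb[:, None] * (x - W)            # SOM-style updates
    C += lr * nb[:, None] * (y_prev - C)
    y_prev = y

# --- Module 2: two-layer perceptron for next-word prediction ---
# Trained separately on the frozen map activations, as in the abstract.
n_hidden = 32
V1 = rng.normal(0, 0.1, (n_units, n_hidden))
V2 = rng.normal(0, 0.1, (n_hidden, vocab_size))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

eta = 0.05
y_prev = np.zeros(n_units)
for t in range(len(stream) - 1):
    x = one_hot(stream[t])
    d = alpha * ((x - W) ** 2).sum(1) + beta * ((y_prev - C) ** 2).sum(1)
    y = np.exp(-d)                             # frozen-map context vector
    y_prev = y
    hid = np.tanh(y @ V1)                      # hidden layer
    p = softmax(hid @ V2)                      # next-word distribution
    target = one_hot(stream[t + 1])
    # Backprop for cross-entropy loss; gradients stop at the map output.
    g_out = p - target
    g_hid = (g_out @ V2.T) * (1 - hid ** 2)
    V2 -= eta * np.outer(hid, g_out)
    V1 -= eta * np.outer(y, g_hid)
```

Note the design choice the abstract emphasizes: because the modules are trained separately, the perceptron's error signal never propagates into the self-organizing map, so the temporal context representations are shaped by unsupervised self-organization alone.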