A sentence generation network that learns surface and abstract syntactic structures

  • Authors:
  • Martin Takac, Lubica Benuskova, Alistair Knott

  • Affiliations:
  • All authors: Dept. of Computer Science, University of Otago, Dunedin, New Zealand

  • Venue:
  • ICANN'11: Proceedings of the 21st International Conference on Artificial Neural Networks - Volume Part II
  • Year:
  • 2011

Abstract

In this paper, we present a connectionist model of sentence generation based on the novel idea that sentence meanings are represented in the brain as sequences of sensorimotor signals that are replayed during sentence generation. Our model can learn surface patterns in language as well as abstract word-ordering conventions. The former is achieved by a recurrent network module; the latter by a feed-forward network that learns to inhibit the overt pronunciation of predicted words during certain phases of sensorimotor sequence rehearsal. Another novel element of the model is adaptive switching of control based on the uncertainty (entropy) of predicted word distributions. Experiments with the model show that it can learn the syntax, morphology and semantics of a target language and generalize well to unseen meanings and sentences.
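To make the entropy-based control switching concrete, the sketch below computes the Shannon entropy of a predicted word distribution and hands control between the two modules accordingly. This is a minimal illustration of the general idea only: the module names, the fixed threshold, and the interface are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def entropy(word_probs, eps=1e-12):
    """Shannon entropy (in bits) of a predicted word distribution."""
    p = np.asarray(word_probs, dtype=float)
    p = p / p.sum()  # normalize defensively
    return float(-np.sum(p * np.log2(p + eps)))

def select_controller(word_probs, threshold=2.0):
    """Adaptive switching: when the next-word prediction is confident
    (low entropy), let the recurrent surface module carry on; when it
    is uncertain (high entropy), return control to the semantic /
    sensorimotor controller.  The threshold is an illustrative
    assumption, not a value from the paper."""
    if entropy(word_probs) < threshold:
        return "surface_module"
    return "semantic_controller"

# Example: a confident prediction vs. a near-uniform one.
confident = [0.90, 0.05, 0.03, 0.02]
uncertain = [0.26, 0.25, 0.25, 0.24]
print(select_controller(confident))  # -> surface_module
print(select_controller(uncertain))  # -> semantic_controller
```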