Images to syntax: a neuropropositional model of language

  • Authors: Paul Buchheit
  • Affiliation: Department of Computer Information Systems, Harold Washington College, 30 E. Lake St., Chicago, IL 60601, USA
  • Venue: Cognitive Systems Research
  • Year: 2000

Abstract

This paper argues that language acquisition can be explained through the interactions of neural networks that represent images and words. According to the hypothesis presented here, language development is largely a learning process in which grammatical rules are derived from a universal capability for recognizing case roles and propositions in perceived and imagined mental images, a capability grounded in the spatial and temporal attributes of neural connectivity. Neural patterns representing sounds and words become associated with image objects, and more abstract part-of-speech patterns form expectations that lead to syntactic rules. A conceptual model of this image-syntactic hypothesis is outlined as a series of steps describing the transition from mental images to syntactic constructions.
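The abstract's two learning steps (associating word patterns with image objects, then forming ordering expectations over case roles) can be illustrated with a toy program. The Python sketch below is a minimal illustration under simplifying assumptions, not the paper's model: "scenes" are hand-coded as (object, case-role) pairs paired with a heard sentence, and neural association is stood in for by plain co-occurrence counting.

```python
# Toy sketch (illustrative assumptions only) of the image-syntactic idea:
# words are linked to image objects by co-occurrence, and a crude word-order
# expectation over case roles is induced from scene/sentence pairs.

from collections import Counter, defaultdict

# Each scene: a mental image reduced to (object, case_role) pairs,
# paired with the sentence a learner hears while perceiving it.
scenes = [
    ([("dog", "AGENT"), ("ball", "PATIENT")], ["dog", "chases", "ball"]),
    ([("cat", "AGENT"), ("mouse", "PATIENT")], ["cat", "chases", "mouse"]),
    ([("girl", "AGENT"), ("apple", "PATIENT")], ["girl", "eats", "apple"]),
]

# Step 1: associate word patterns with image objects by co-occurrence,
# a stand-in for Hebbian-style strengthening between neural patterns.
word_object = defaultdict(Counter)
for objects, sentence in scenes:
    for word in sentence:
        for obj, _role in objects:
            if word == obj:  # toy stand-in for cross-modal pattern matching
                word_object[word][obj] += 1

# Step 2: induce an ordering expectation over case roles by counting,
# for each pair of roles, how often one precedes the other in speech.
order = Counter()
for objects, sentence in scenes:
    role_of = {obj: role for obj, role in objects}
    positions = {role_of[w]: i for i, w in enumerate(sentence) if w in role_of}
    for r1, p1 in positions.items():
        for r2, p2 in positions.items():
            if p1 < p2:
                order[(r1, r2)] += 1

print(dict(word_object))   # word -> image-object association counts
print(order.most_common()) # AGENT precedes PATIENT in every scene
```

Run on these three toy scenes, the counts show AGENT preceding PATIENT in every sentence, the kind of regularity that, under the hypothesis, could harden from an expectation into a syntactic rule.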