Incremental nonmonotonic parsing through semantic self-organization

  • Authors:
  • Marshall Reeves Mayberry, III; Risto Miikkulainen

  • Year:
  • 2003

Abstract

Subsymbolic systems have been successfully used to model several aspects of human language processing. Subsymbolic parsers are appealing because they combine syntactic, semantic, and thematic constraints in sentence interpretation and revise that interpretation nonmonotonically while processing a sentence incrementally. Such parsers are also cognitively plausible: processing is robust, and multiple interpretations are simultaneously activated when the input is ambiguous. Yet it has proven very difficult to scale them up to realistic language: they have limited memory capacity, training takes a long time, and it is difficult to represent linguistic structure. A new connectionist model, INSOMNet, scales up the subsymbolic approach through semantic self-organization. INSOMNet was trained on semantic dependency graph representations from the recently released LinGO Redwoods HPSG Treebank of sentences from the VerbMobil project. The results show that INSOMNet learns to represent these semantic dependencies accurately and generalizes to novel structures. Further evaluation on the original VerbMobil sentences, transcribed with annotations for spoken language, demonstrates robust parsing of noisy input, while graceful degradation in performance when noise is added to the network weights underscores INSOMNet's tolerance to damage. Finally, the cognitive plausibility of the model is demonstrated on a standard psycholinguistic benchmark, on which INSOMNet exhibits expectations and defaults, coactivation of multiple interpretations, nonmonotonicity, and semantic priming.
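
The self-organization named in the title builds on the general mechanism of self-organizing maps (SOMs), in which map units compete for inputs and nearby units learn to represent similar inputs. The sketch below is a minimal, generic SOM in Python, not the INSOMNet architecture itself; the grid size, decay schedule, function names, and toy data are hypothetical choices made for illustration.

```python
# Minimal self-organizing map (SOM) sketch. Illustrates the generic
# self-organization mechanism only; this is NOT INSOMNet. Parameters
# such as grid_shape, lr0, and sigma0 are arbitrary choices.
import numpy as np

def train_som(data, grid_shape=(8, 8), epochs=20,
              lr0=0.5, sigma0=3.0, seed=0):
    """Fit a 2-D SOM to `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    weights = rng.normal(size=(rows * cols, data.shape[1]))
    # Grid coordinates of each map unit, for the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)],
                      dtype=float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Exponentially decay learning rate and neighborhood width.
            frac = step / n_steps
            lr = lr0 * np.exp(-3.0 * frac)
            sigma = sigma0 * np.exp(-3.0 * frac)
            # Best-matching unit: closest weight vector to the input.
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            # Gaussian neighborhood on the grid around the BMU.
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))
            # Move every unit toward the input, scaled by neighborhood.
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights.reshape(rows, cols, -1)

if __name__ == "__main__":
    # Toy stand-in for semantic frame encodings: two Gaussian clusters.
    # After training, nearby map units represent similar vectors.
    rng = np.random.default_rng(1)
    toy = np.vstack([rng.normal(m, 0.1, size=(50, 4)) for m in (0.0, 1.0)])
    som = train_som(toy)
    print("trained SOM weight grid:", som.shape)
```

In the paper, a comparable self-organizing process is applied to encodings of semantic frames, so that map units come to stand for classes of frames; the sketch above shows only the underlying mechanism of best-matching-unit selection and neighborhood-weighted updates.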