Integrating Linguistic Primitives in Learning Context-Dependent Representation

  • Authors: Samuel W. K. Chan
  • Affiliations: -
  • Venue: IEEE Transactions on Knowledge and Data Engineering
  • Year: 2001


Abstract

This paper presents an explicit, connectionist-inspired language learning model in which the process of settling on a particular interpretation for a sentence emerges from the interaction of a set of "soft" lexical, semantic, and syntactic primitives. We address how these distinct linguistic primitives can be encoded in separate modular knowledge sources yet participate in an interactive process that makes implicit linguistic information explicit. The learning of a quasi-logical form, called the context-dependent representation, is inherently incremental and dynamical: every semantic interpretation is related to what has already been established in the context created by prior utterances. The context-dependent representation strengthens the model's text-understanding capability. This approach also shows how the recursive and compositional role of a sentence, as conveyed in its syntactic structure, can be modeled in a neurobiologically motivated linguistics based on dynamical systems rather than on a combinatorial symbolic architecture. Experiments on more than 2,000 sentences in different languages illustrate, among other issues, the influence of the context-dependent representation on semantic interpretation.
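
The abstract describes an architecture in which separate lexical, semantic, and syntactic modules interact to settle on one interpretation per sentence, while a context representation built from prior utterances biases each new reading. The sketch below is a loose, hypothetical illustration of that idea only, assuming a toy relaxation scheme with made-up dimensions, scores, and update rules; it is not the paper's actual model.

```python
# Minimal, hypothetical sketch of soft-constraint settling plus an
# incrementally updated context representation. Every name, size, and
# constant here is illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

N_INTERP = 4   # candidate interpretations of the current sentence
N_DIM = 8      # size of the toy context-dependent representation

# Each knowledge source scores every candidate independently; random
# values stand in for real lexical/semantic/syntactic module outputs.
lexical_score = rng.random(N_INTERP)
semantic_score = rng.random(N_INTERP)
syntactic_score = rng.random(N_INTERP)

def settle(scores, context_bias, steps=50, rate=0.1):
    """Relax activations toward a fixed point that reconciles all soft
    constraints plus the bias contributed by the discourse context."""
    a = np.full(N_INTERP, 1.0 / N_INTERP)        # start uninformed
    evidence = sum(scores) + context_bias        # sum of soft constraints
    target = np.exp(evidence) / np.exp(evidence).sum()
    for _ in range(steps):
        a += rate * (target - a)                 # gradual settling
    return a

# Hypothetical embeddings for each interpretation, used both to derive
# the context bias and to update the context after each sentence.
interp_vecs = rng.standard_normal((N_INTERP, N_DIM))
context = np.zeros(N_DIM)                        # empty discourse so far

for utterance in range(3):                       # a three-sentence "text"
    bias = interp_vecs @ context                 # context-dependent bias
    act = settle([lexical_score, semantic_score, syntactic_score], bias)
    winner = int(act.argmax())
    # Incremental update: fold the chosen reading into the context, so
    # later sentences are interpreted relative to what came before.
    context = 0.7 * context + 0.3 * interp_vecs[winner]
    print(f"utterance {utterance}: interpretation {winner}, act={act.round(2)}")
```

The exponential-decay context update (the 0.7/0.3 mix) merely stands in for the incremental, dynamical learning the abstract mentions; a real instantiation would learn such dynamics rather than fix them by hand.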