On the relationship between lexical semantics and syntax for the inference of context-free grammars

  • Authors:
Tim Oates; Tom Armstrong; Justin Harris; Mark Nejman

  • Affiliations:
Department of Computer Science and Electrical Engineering, University of Maryland Baltimore County, Baltimore, MD (all authors)

  • Venue:
AAAI'04: Proceedings of the 19th National Conference on Artificial Intelligence
  • Year:
  • 2004


Abstract

Context-free grammars cannot be identified in the limit from positive examples (Gold 1967), yet natural language grammars are more powerful than context-free grammars and humans learn them with remarkable ease from positive examples (Marcus 1993). Identifiability results for formal languages ignore a potentially powerful source of information available to learners of natural languages, namely, meanings. This paper explores the learnability of syntax (i.e., context-free grammars) given positive examples and knowledge of lexical semantics, and the learnability of lexical semantics given knowledge of syntax. The long-term goal is to develop an approach to learning both syntax and semantics that bootstraps itself, using limited knowledge about syntax to infer additional knowledge about semantics, and limited knowledge about semantics to infer additional knowledge about syntax.