Learning word meanings from examples

  • Authors:
  • Robert C. Berwick

  • Affiliations:
  • MIT Artificial Intelligence Laboratory, Cambridge, MA

  • Venue:
  • IJCAI'83: Proceedings of the Eighth International Joint Conference on Artificial Intelligence - Volume 1

  • Year:
  • 1983

Abstract

This paper describes work in progress on a computer program that uses syntactic constraints to derive the meanings of verbs from an analysis of simple English example stories. The central idea is an extension of Winston's program (Winston 1975) that learned structural descriptions of blocks world scenes. In the new research, English verbs take the place of blocks world objects such as ARCH and TOWER, with frame-based descriptions of causal relationships serving as the structural descriptions. Syntactic constraints derived from parsing the story plots drive an analogical matching procedure, which provides a way to compare descriptions of known words with those of unknown words. The "meaning" of a new verb is learned by matching part of the causal network description of a story précis containing the unknown word against a set of such descriptions derived from similar stories that contain only known words. The best match forges an assignment between objects and relations such that the unknown verb is matched to a known verb, with the assignment guided by syntactic constraints. The causal network surrounding the unknown item is then used as scaffolding to construct a network representing the use of the novel word in that particular context. Words (and their associated stories) that are "best matches" are grouped into a similarity network according to their match scores.
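
The matching procedure described in the abstract can be sketched in a simplified form. The Python fragment below is a minimal illustration, not the paper's implementation: it assumes toy representations in which each story description is a set of causal triples plus parser-assigned syntactic roles, and the names (StoryDescription, match_score, learn_verb, and the example verbs and relations) are hypothetical. It scores role-respecting assignments between the participants of an unknown-verb story and those of stories built from known verbs, then adopts the best-matching known verb.

```python
from dataclasses import dataclass, field
from itertools import permutations

# Hypothetical, simplified stand-in for the paper's frame-based causal
# descriptions: each story precis becomes a set of (relation, x, y) triples
# plus the syntactic role the parser assigned to each participant.
@dataclass
class StoryDescription:
    verb: str                                    # verb this description exemplifies
    relations: set = field(default_factory=set)  # causal triples (relation, x, y)
    roles: dict = field(default_factory=dict)    # participant -> "subject" / "object" / ...

def participants(desc: StoryDescription) -> list:
    objs = set()
    for _, x, y in desc.relations:
        objs.update((x, y))
    return sorted(objs)

def match_score(unknown: StoryDescription, known: StoryDescription):
    """Best role-respecting assignment of unknown-story participants onto
    known-story participants, scored by shared causal structure."""
    u_objs, k_objs = participants(unknown), participants(known)
    best_score, best_map = 0, {}
    # Exhaustive search is fine for toy-sized networks.
    for perm in permutations(k_objs, len(u_objs)):
        mapping = dict(zip(u_objs, perm))
        # Syntactic constraint (assumed form): mapped participants must
        # fill the same syntactic role in the parsed plot.
        if any(unknown.roles.get(u) != known.roles.get(k) for u, k in mapping.items()):
            continue
        mapped = {(r, mapping[x], mapping[y]) for r, x, y in unknown.relations}
        score = len(mapped & known.relations)    # count of shared causal relations
        if score > best_score:
            best_score, best_map = score, mapping
    return best_score, best_map

def learn_verb(unknown: StoryDescription, known_stories: list) -> str:
    """Adopt the meaning of the best-matching known verb for the unknown one."""
    return max(known_stories, key=lambda k: match_score(unknown, k)[0]).verb

# Toy usage: "filch" should align with "steal" rather than "give".
steal = StoryDescription("steal",
    {("cause-loss", "thief", "owner"), ("gain", "thief", "goods")},
    {"thief": "subject", "owner": "object", "goods": "object"})
give = StoryDescription("give",
    {("cause-gain", "donor", "recipient"), ("lose", "donor", "goods")},
    {"donor": "subject", "recipient": "object", "goods": "object"})
filch = StoryDescription("filch",
    {("cause-loss", "crook", "victim"), ("gain", "crook", "loot")},
    {"crook": "subject", "victim": "object", "loot": "object"})

print(learn_verb(filch, [steal, give]))  # -> "steal"
```

Under the same assumptions, the scores returned for each known story could also serve as edge weights when grouping best-matching words and stories into a similarity network.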