Language acquisition: learning a hierarchy of phrases

  • Author:
  • Uri Zernik

  • Affiliations:
  • GE Corporate Research and Development, Schenectady, New York and Computer Science Department, University of California, Los Angeles, California

  • Venue:
  • IJCAI'87 Proceedings of the 10th International Joint Conference on Artificial Intelligence - Volume 1
  • Year:
  • 1987

Abstract

The hierarchical lexicon, in contrast to the traditional flat lexicon, enables a linguistic model to perform even in situations of incomplete knowledge: when a specific entry is missing, a more general entry can cover the gap. The question remains, however, how the lexicon itself is constructed. Since the lexicon is organized as a hierarchy, and not as a flat structure, phrases cannot simply be placed in the lexicon: they must be interconnected with other phrases in the hierarchy at the appropriate level of generality. Furthermore, since input examples are always given in terms of specific phrases, phrases must be propagated up and down the hierarchy, starting at the bottom level. In this paper we describe a learning algorithm which is based on two existing machine-learning models: learning in a version space [Mitchell82], and learning by accumulating specific episodes in a dynamic memory [Kolodner84, Schank82]. The input required by the algorithm is a sequence of specific episodes, or training examples, from which lexical entries at various levels in the hierarchy are generalized and specialized. The algorithm is embodied in the program RINA [Zernik87b], which models the learning of English phrases by a second-language speaker.
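
The process the abstract describes can be pictured with a small, hypothetical sketch: specific phrase episodes are stored as lexicon entries, and when two equally long episodes differ in some word slots, a more general parent entry is created by opening those slots, loosely in the spirit of version-space generalization. The names (HierarchicalLexicon, generalize), the single-word wildcard convention, and the same-length restriction are illustrative assumptions, not RINA's actual data structures.

```python
# Hypothetical sketch of growing a phrase hierarchy from specific episodes.
# Not the RINA implementation; all names and conventions are assumptions.

WILDCARD = "*"  # a slot that matches any single word (a more general level)


def matches(pattern, phrase):
    """A pattern covers an equal-length phrase if every slot is the same word or a wildcard."""
    return len(pattern) == len(phrase) and all(
        p == WILDCARD or p == w for p, w in zip(pattern, phrase)
    )


def generalize(pattern_a, pattern_b):
    """Least general generalization of two equal-length patterns:
    keep shared words, open differing slots with a wildcard."""
    return tuple(a if a == b else WILDCARD for a, b in zip(pattern_a, pattern_b))


class HierarchicalLexicon:
    def __init__(self):
        # parent pattern -> list of more specific child patterns
        self.children = {}

    def add_episode(self, phrase):
        """Insert a specific phrase; when another equally long entry exists,
        propagate a generalized parent up the hierarchy."""
        phrase = tuple(phrase)
        for entry in list(self.children):
            if entry != phrase and len(entry) == len(phrase):
                parent = generalize(entry, phrase)
                if parent != entry and parent != phrase:
                    kids = self.children.setdefault(parent, [])
                    for child in (entry, phrase):
                        if child not in kids:
                            kids.append(child)
        self.children.setdefault(phrase, [])

    def lookup(self, phrase):
        """Prefer the most specific covering entry; fall back to a general one."""
        phrase = tuple(phrase)
        covering = [p for p in self.children if matches(p, phrase)]
        return min(covering, key=lambda p: p.count(WILDCARD), default=None)


if __name__ == "__main__":
    lex = HierarchicalLexicon()
    lex.add_episode(["take", "the", "gloves", "off"])
    lex.add_episode(["take", "the", "mask", "off"])
    # a novel phrase of the same shape is covered by the generalized parent
    print(lex.lookup(["take", "the", "hat", "off"]))  # ('take', 'the', '*', 'off')
```

In this toy version, a missing specific entry ("take the hat off") is covered by the general entry created from earlier episodes, which is the gap-covering behavior the abstract attributes to the hierarchical lexicon.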