A conceptual dependency parser for natural language

  • Authors:
  • Roger C. Schank; Larry Tesler

  • Affiliations:
  • Computer Science Department, Stanford University, Stanford, California

  • Venue:
  • COLING '69 Proceedings of the 1969 conference on Computational linguistics
  • Year:
  • 1969


Abstract

This paper describes an operable automatic parser for natural language. The parser is not concerned with producing the syntactic structure of an input sentence. Instead, it is a conceptual parser, concerned with determining the underlying meaning of the input. Given a natural language input, the parser identifies and disambiguates the concepts derivable from that input and places them into a network that explicates their inter-relations with respect to the unambiguous meaning of the input.

The parser utilizes a conceptually-oriented dependency grammar that has as its highest level the network which represents the underlying conceptual structure of a linguistic input. The parser also incorporates a language-free semantics that checks all possible conceptual dependencies against its own knowledge of the world.

The parser is capable of learning new words and new constructions. It presently has a vocabulary of a few hundred words, which enables it to operate in a psychiatric interviewing program without placing any restriction on the linguistic input.

The theory behind conceptual dependency is outlined in this paper, and the parsing algorithm is explained in some detail.
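To make the idea of a conceptual (rather than syntactic) parse concrete, the following is a minimal, hypothetical sketch of assembling a conceptual dependency network from a sentence. It is purely illustrative: the class names, the toy lexicon, and the actor/act/object assembly rule are this sketch's assumptions, not the authors' 1969 implementation, which was far richer (disambiguation, semantic checking, learning).

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: "PP" (a nominal concept) and "ACT" (an action concept)
# are simplified stand-ins for conceptual categories; the two-way link
# between actor and act is rendered as "<=>", the object link as "<-".

@dataclass
class Concept:
    word: str
    category: str  # "PP" or "ACT"

@dataclass
class Conceptualization:
    actor: Concept
    act: Concept
    obj: Optional[Concept] = None

    def render(self) -> str:
        s = f"{self.actor.word} <=> {self.act.word}"
        if self.obj:
            s += f" <- {self.obj.word}"
        return s

# Toy lexicon mapping words to conceptual categories (hypothetical).
LEXICON = {
    "John": "PP",
    "ball": "PP",
    "hit": "ACT",
}

def parse(tokens):
    """Assemble a conceptualization from tokens, ignoring function words
    not in the lexicon; assumes actor precedes the act."""
    concepts = [Concept(t, LEXICON[t]) for t in tokens if t in LEXICON]
    actor = next(c for c in concepts if c.category == "PP")
    act = next(c for c in concepts if c.category == "ACT")
    objs = [c for c in concepts if c.category == "PP" and c is not actor]
    return Conceptualization(actor, act, objs[0] if objs else None)

net = parse(["John", "hit", "the", "ball"])
print(net.render())  # John <=> hit <- ball
```

Note that the output is a meaning network, not a phrase-structure tree: the syntactic words "the" simply vanish, while the concepts and their dependencies remain, which is the distinction the abstract draws between a conceptual and a syntactic parser.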