A new semantics: merging propositional and distributional information

  • Author: Eduard Hovy
  • Affiliation: University of Southern California
  • Venue: IWCS '11: Proceedings of the Ninth International Conference on Computational Semantics
  • Year: 2011

Abstract

Despite hundreds of years of study of semantics, theories and representations of semantic content (the actual meaning of the symbols used in semantic propositions) remain impoverished. The traditional extensional and intensional models of semantics are difficult to flesh out in practice, and no large-scale models of this kind exist. Recently, researchers in Natural Language Processing (NLP) have increasingly treated topic signature word distributions (also called 'context vectors', 'topic models', 'language models', etc.) as a de facto placeholder for semantics at various levels of granularity. This talk argues for a new kind of semantics that combines traditional symbolic, logic-based, proposition-style semantics (of the kind used in older NLP) with computation-based statistical word distribution information (what is being called Distributional Semantics in modern NLP). The core resource is a single lexico-semantic 'lexicon' that can be used for a variety of tasks. I show how to define such a lexicon, how to build and format it, and how to use it for various tasks. Combining the two views of semantics opens many fascinating questions that invite study, including the operation of logical operators such as negation and modalities over word(sense) distributions, the nature of the ontological facets required to define concepts, and the action of compositionality over statistical concepts.
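
The abstract leaves the lexicon's format open, so the Python sketch below is one concrete illustration only: it pairs a symbolic sense identifier and ontological facets with a distributional context vector, and implements one simple candidate operator each for compositionality (pointwise multiplication of vectors, in the style of Mitchell and Lapata) and negation (a renormalized complement of the distribution). All names, the entry structure, and the three-dimensional toy vocabulary are assumptions for illustration, not the paper's actual resource or API.

    # A minimal sketch (assumed names throughout; not the paper's actual lexicon)
    # of a hybrid entry pairing a symbolic sense with a distributional vector.
    from dataclasses import dataclass

    import numpy as np

    @dataclass
    class LexiconEntry:
        sense_id: str        # symbolic handle usable inside logical propositions
        facets: dict         # ontological facets defining the concept
        vector: np.ndarray   # distributional signature over context dimensions

    def compose(a: LexiconEntry, b: LexiconEntry) -> np.ndarray:
        # Pointwise multiplication: one common baseline for compositionality
        # over distributions.
        return a.vector * b.vector

    def negate(e: LexiconEntry) -> np.ndarray:
        # Toy negation: renormalized complement of the distribution, shifting
        # probability mass toward contexts the sense does NOT occur in.
        comp = e.vector.max() - e.vector
        return comp / comp.sum()

    # Toy three-dimensional context vocabulary: ("bark", "purr", "meow").
    dog = LexiconEntry("dog#n#1", {"type": "animal"}, np.array([0.7, 0.1, 0.2]))
    cat = LexiconEntry("cat#n#1", {"type": "animal"}, np.array([0.1, 0.5, 0.4]))

    print(compose(dog, cat))  # [0.07 0.05 0.08] -- contexts the senses share
    print(negate(dog))        # mass shifted away from the 'bark' dimension

Pointwise multiplication is only one of several composition functions such a combined semantics would need to evaluate, and the complement-based negation is deliberately naive; which operators behave sensibly over word(sense) distributions is exactly the kind of open question the abstract raises.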