Default-reasoning with models

  • Authors: Roni Khardon; Dan Roth
  • Affiliations: Aiken Computation Laboratory, Harvard University, Cambridge, MA (both authors)
  • Venue: IJCAI'95: Proceedings of the 14th International Joint Conference on Artificial Intelligence - Volume 1
  • Year: 1995

Abstract

Reasoning with model-based representations is an intuitive paradigm that has been shown to be theoretically sound and to possess some computational advantages over reasoning with formula-based representations of knowledge. In this paper we present further evidence of the utility of such representations. In real-life situations, one normally fills in a great deal of missing "context" information when answering queries. We model this situation by augmenting the available knowledge about the world with context-specific information, and we show that reasoning with model-based representations can be done efficiently in the presence of varying context information. We then consider the task of default reasoning. We show that default reasoning is a generalization of reasoning within context, in which the reasoner has many "context" rules that may conflict with one another. We characterize the cases in which model-based reasoning supports efficient default reasoning and develop algorithms that efficiently handle fragments of Reiter's default logic. In particular, this includes cases in which performing the default reasoning task with the traditional, formula-based representation is intractable. Finally, we argue that these results naturally support an incremental view of reasoning.
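
To give a rough intuition for the abstract's "reasoning within context" with a model-based representation: if the knowledge base is stored as a set of models rather than formulas, a query under a given context can be answered by checking only the stored models that are consistent with that context. The sketch below is a minimal illustration of this filter-and-check pattern, not the paper's algorithm; the variable names and the propositional toy example are assumptions made for the example.

```python
def entails_in_context(models, context, query):
    """Return True if every stored model that satisfies `context`
    also satisfies `query` (context-restricted model-based deduction)."""
    relevant = [m for m in models if context(m)]
    return all(query(m) for m in relevant)


# Toy knowledge base: each model is a truth assignment over three propositions.
models = [
    {"rain": True,  "wet": True,  "umbrella": True},
    {"rain": True,  "wet": True,  "umbrella": False},
    {"rain": False, "wet": False, "umbrella": False},
]

# Context: it is raining.  Query: the ground is wet.
context = lambda m: m["rain"]
query = lambda m: m["wet"]

print(entails_in_context(models, context, query))  # True for this toy example
```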