Adaptation-based explanation: extending script/frame theory to handle novel input

  • Authors: Alex Kass
  • Affiliations: Yale University, Department of Computer Science, New Haven, CT
  • Venue: IJCAI'89: Proceedings of the 11th International Joint Conference on Artificial Intelligence, Volume 1
  • Year: 1989

Abstract

The ability to develop hypotheses to explain new, unexpected experiences is a hallmark of human intelligence. It is also a crucial concern within artificial intelligence, since intelligent computer systems need to construct explanations in order to guide learning, to recover from planning failures, and to make sense of the stories that they read. The principal issue in designing a computer program that builds explanations is how to bring the system's causal knowledge to bear on a problem so that it can efficiently infer an unseen cause of observed events. In this paper we discuss some drawbacks of previous approaches to this problem, and present an alternative. The alternative involves extending script/frame theory via a system that adapts its stored explanations to new situations. We discuss the types of explanation failures that occur, and how the system employs adaptation strategies to repair those failures. The execution of one of the strategies is demonstrated.
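The abstract describes a retrieve-evaluate-adapt cycle: the system recalls a stored explanation, checks it against the new situation to detect explanation failures, and applies adaptation strategies to repair those failures. The following is a minimal sketch of that control loop, not the paper's actual system; all names (Explanation, Failure, AdaptationStrategy, evaluate, adapt) and the contradiction test are hypothetical simplifications introduced here for illustration.

```python
# Hedged sketch of an adaptation-based explanation loop.
# The data structures and the failure-detection rule are illustrative only.

from dataclasses import dataclass
from typing import Callable, List, Optional, Set


@dataclass
class Explanation:
    """A stored causal explanation: a chain of beliefs leading to an outcome."""
    outcome: str
    causal_chain: List[str]


@dataclass
class Failure:
    """A mismatch found when applying a stored explanation to a new situation."""
    kind: str    # category of failure, used to select a repair strategy
    detail: str  # the offending step in the causal chain


@dataclass
class AdaptationStrategy:
    """A repair rule: handles one class of failure and rewrites the explanation."""
    handles: str
    repair: Callable[[Explanation, Failure], Explanation]


def evaluate(explanation: Explanation, situation: Set[str]) -> List[Failure]:
    """Report a failure for each causal step the new situation contradicts."""
    return [Failure(kind="contradicted-belief", detail=step)
            for step in explanation.causal_chain
            if f"not {step}" in situation]


def adapt(explanation: Explanation,
          situation: Set[str],
          strategies: List[AdaptationStrategy],
          max_rounds: int = 5) -> Optional[Explanation]:
    """Repeatedly detect failures and apply a matching repair strategy."""
    for _ in range(max_rounds):
        failures = evaluate(explanation, situation)
        if not failures:
            return explanation  # explanation now fits the new situation
        failure = failures[0]
        strategy = next((s for s in strategies if s.handles == failure.kind), None)
        if strategy is None:
            return None         # no known repair for this failure type
        explanation = strategy.repair(explanation, failure)
    return None


# Toy usage: a stored explanation whose only causal step is contradicted,
# repaired by a strategy that substitutes an alternative cause.
substitute = AdaptationStrategy(
    handles="contradicted-belief",
    repair=lambda e, f: Explanation(
        e.outcome,
        [s for s in e.causal_chain if s != f.detail] + ["alternative cause"]))

stored = Explanation("engine stalled", ["fuel ran out"])
print(adapt(stored, {"not fuel ran out"}, [substitute]))
```

The point of the sketch is only the shape of the process: failures are typed, and each adaptation strategy is indexed by the failure type it repairs, which is the organization the abstract attributes to the system.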