Using attention in belief revision

  • Authors:
  • Xueming Huang;Gordon I. McCalla;Eric Neufeld

  • Affiliations:
  • ARIES Laboratory, Department of Computational Science, University of Saskatchewan, Saskatoon, Saskatchewan, Canada (all authors)

  • Venue:
  • AAAI'91: Proceedings of the Ninth National Conference on Artificial Intelligence - Volume 1
  • Year:
  • 1991


Abstract

Belief revision for an intelligent system is usually computationally expensive. Here we tackle this problem by using focus in belief revision: revision occurs only in the subset of beliefs currently under attention (in focus). Attention can be shifted within the belief base, allowing other subsets of beliefs to be used and revised. This attention-shifting belief revision architecture shows promise for efficient and natural revision of belief bases.
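
The core idea — that revision only touches beliefs currently in focus, leaving out-of-focus conflicts alone until attention shifts to them — can be sketched roughly as follows. This is an illustrative sketch, not the authors' algorithm: the `FocusedBeliefBase` class, its method names, and the use of propositional literals (where `~p` conflicts with `p`) are all assumptions made for the example.

```python
# Illustrative sketch (NOT the paper's algorithm): a belief base in which
# revision retracts conflicting beliefs only if they are in focus.
# Beliefs are propositional literals; "p" and "~p" conflict.

class FocusedBeliefBase:
    def __init__(self, beliefs):
        self.beliefs = set(beliefs)   # the entire belief base
        self.focus = set()            # subset currently under attention

    def shift_focus(self, subset):
        """Bring a different subset of beliefs under attention."""
        self.focus = self.beliefs & set(subset)

    @staticmethod
    def _negate(lit):
        return lit[1:] if lit.startswith("~") else "~" + lit

    def revise(self, new_belief):
        """Add new_belief; retract a conflicting belief only if it is
        in focus. Out-of-focus conflicts are simply left untouched,
        which is what keeps each revision step cheap."""
        conflict = self._negate(new_belief)
        if conflict in self.focus:
            self.focus.discard(conflict)
            self.beliefs.discard(conflict)
        self.beliefs.add(new_belief)
        self.focus.add(new_belief)


bb = FocusedBeliefBase({"rainy", "~windy", "cold"})
bb.shift_focus({"rainy", "cold"})
bb.revise("~rainy")   # "rainy" is in focus, so it is retracted
bb.revise("windy")    # "~windy" is out of focus, so it survives for now
```

In this sketch, inconsistencies involving out-of-focus beliefs are tolerated until attention shifts to them, so each revision step costs time proportional to the focus set rather than the whole belief base.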