Belief management for high-level robot programs

  • Authors:
  • Stephan Gspandl;Ingo Pill;Michael Reip;Gerald Steinbauer;Alexander Ferrein

  • Affiliations:
  • Institute for Software Technology, Graz University of Technology, Graz, Austria (Gspandl, Pill, Reip, Steinbauer); Knowledge-Based Systems Group, RWTH Aachen University, Aachen, Germany (Ferrein)

  • Venue:
  • IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Two
  • Year:
  • 2011

Abstract

The robot programming and planning language IndiGolog allows for on-line execution of actions and off-line projection of programs in dynamic and partially unknown environments. Basic assumptions are that the outcomes of primitive and sensing actions are correctly modeled, and that the agent is informed about all exogenous events beyond its control. In real-world applications, however, such assumptions do not hold: actions may fail or have unexpected outcomes, and sensing results are noisy. In this paper, we present a belief management system in IndiGolog that is able to detect inconsistencies between a robot's modeled belief and what actually happened in reality. The system furthermore derives explanations and maintains a consistent belief. Our main contributions are (1) a belief management system following a history-based diagnosis approach that allows an agent to actively cope with faulty actions and the occurrence of exogenous events; and (2) an implementation in IndiGolog and experimental results from a delivery domain.
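The core idea of history-based diagnosis described in the abstract can be sketched in a toy form: replay the action history under different assumptions about which actions actually succeeded, and keep the minimal-fault assignments whose projected belief matches the sensed observation. The following Python sketch is purely illustrative (the fluent `at`, the `goto` action, and the delivery-style locations are hypothetical stand-ins, not the paper's Situation-Calculus machinery):

```python
# Illustrative sketch of history-based diagnosis, NOT the paper's
# IndiGolog implementation. Each action in the history is assumed to
# have either its modeled effect ("ok") or no effect ("fail"); the
# diagnoser searches for minimal-fault histories consistent with an
# observation obtained by sensing.
from itertools import product

def project(history, outcomes):
    """Replay the history under assumed per-action outcomes,
    returning the belief state that replay predicts."""
    state = {"at": "depot"}           # hypothetical initial fluent value
    for (action, target), outcome in zip(history, outcomes):
        if action == "goto" and outcome == "ok":
            state["at"] = target      # modeled effect of a successful goto
    return state

def diagnose(history, observed):
    """Return the outcome assignments with the fewest assumed faults
    whose projection matches the observed state."""
    consistent = [v for v in product(("ok", "fail"), repeat=len(history))
                  if project(history, v) == observed]
    if not consistent:
        return []
    fewest = min(v.count("fail") for v in consistent)
    return [v for v in consistent if v.count("fail") == fewest]

history = [("goto", "room1"), ("goto", "room2")]
# Nominal belief after replay: the robot is at room2. Sensing instead
# reports room1, so the only minimal explanation is that the second
# goto failed.
print(diagnose(history, {"at": "room1"}))  # → [('ok', 'fail')]
```

A real implementation must also hypothesize exogenous events (not just action failures) and rank explanations, but the search-over-alternative-histories structure is the same.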