Scaling Up: Solving POMDPs through Value Based Clustering

  • Authors:
  • Yan Virin, Guy Shani, Solomon Eyal Shimony, Ronen Brafman

  • Affiliations:
  • Department of Computer Science, Ben-Gurion University, Beer-Sheva, Israel (all authors)

  • Venue:
  • AAAI'07: Proceedings of the 22nd National Conference on Artificial Intelligence, Volume 2
  • Year:
  • 2007

Abstract

Partially Observable Markov Decision Processes (POMDPs) provide an appropriately rich model for agents operating under partial knowledge of the environment. Since finding an optimal POMDP policy is intractable, approximation techniques have been a main focus of research, among them point-based algorithms, which scale up relatively well - up to thousands of states. An important decision in a point-based algorithm is the order of backup operations over belief states. Prioritization techniques for ordering the sequence of backup operations reduce the number of needed backups considerably, but involve significant overhead. This paper suggests a new way to order backups, based on a soft clustering of the belief space. Our novel soft clustering method relies on the solution of the underlying MDP. Empirical evaluation verifies that our method rapidly computes a good order of backups, showing orders of magnitude improvement in runtime over a number of benchmarks.
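To make the abstract's main ingredients concrete, below is a minimal sketch (not the authors' implementation) of a standard point-based backup and of a soft clustering of belief points driven by the value function of the underlying MDP. It assumes NumPy, a transition tensor T[a, s, s'], an observation tensor O[a, s', o], rewards R[a, s], and beliefs stored as a (points x states) array; the value-level cluster centres and softmax-style weights are illustrative assumptions rather than the paper's exact scheme.

```python
import numpy as np

def solve_underlying_mdp(T, R, gamma, iters=500):
    """Value iteration on the fully observable MDP.
    T[a, s, s'] is the transition model, R[a, s] the reward."""
    V = np.zeros(T.shape[1])
    for _ in range(iters):
        V = np.max(R + gamma * (T @ V), axis=0)
    return V

def point_based_backup(b, Gamma, T, O, R, gamma):
    """Standard point-based backup: the alpha-vector maximizing value at belief b.
    O[a, s', o] is the observation model; Gamma is the current alpha-vector set."""
    n_actions, n_obs = O.shape[0], O.shape[2]
    best_val, best_alpha = -np.inf, None
    for a in range(n_actions):
        g_a = R[a].astype(float).copy()
        for o in range(n_obs):
            # g^{a,o}_alpha(s) = sum_{s'} T[a,s,s'] * O[a,s',o] * alpha(s')
            candidates = [T[a] @ (O[a][:, o] * alpha) for alpha in Gamma]
            g_a = g_a + gamma * max(candidates, key=lambda g: b @ g)
        if b @ g_a > best_val:
            best_val, best_alpha = b @ g_a, g_a
    return best_alpha

def soft_cluster_weights(beliefs, V_mdp, n_clusters, temperature=1.0):
    """Soft-assign each belief to clusters centred on evenly spaced levels of
    its expected MDP value (an illustrative stand-in for the paper's method)."""
    values = beliefs @ V_mdp                      # expected MDP value per belief
    centers = np.linspace(values.min(), values.max(), n_clusters)
    scores = -np.abs(values[:, None] - centers[None, :]) / temperature
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    return w / w.sum(axis=1, keepdims=True)       # rows sum to 1

def cluster_ordered_backups(beliefs, Gamma, T, O, R, gamma, V_mdp, n_clusters=4):
    """Back up belief points cluster by cluster, highest-value cluster first."""
    w = soft_cluster_weights(beliefs, V_mdp, n_clusters)
    order = np.argsort(-w.argmax(axis=1))         # crude ordering by dominant cluster
    for i in order:
        Gamma.append(point_based_backup(beliefs[i], Gamma, T, O, R, gamma))
    return Gamma
```

The intent mirrors the abstract: the MDP solution is cheap to compute and supplies a value signal for grouping belief points, so backups can be ordered by cluster rather than by a more expensive per-belief prioritization.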