Decentralized multi-robot cooperation with auctioned POMDPs

  • Authors:
  • Jesus Capitan; Matthijs T.J. Spaan; Luis Merino; Anibal Ollero

  • Affiliations:
  • University of Duisburg-Essen, Duisburg, Germany; Delft University of Technology, Delft, The Netherlands; Pablo de Olavide University, Seville, Spain; University of Seville, Seville, Spain

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2013

Abstract

Planning under uncertainty faces a scalability problem when considering multi-robot teams, as the information space scales exponentially with the number of robots. To address this issue, this paper proposes to decentralize multi-robot partially observable Markov decision processes (POMDPs) while maintaining cooperation between robots through POMDP policy auctions. Auctions provide a flexible way of coordinating individual policies modeled by POMDPs and have low communication requirements. In addition, the communication models typically assumed in the multi-agent POMDP literature are a poor match for real inter-robot communication; we address this by exploiting a decentralized data fusion method to efficiently maintain a joint belief state among the robots. The paper presents two applications: environmental monitoring with unmanned aerial vehicles (UAVs), and cooperative tracking, in which several robots jointly track a moving target of interest. The first serves as a proof of concept and illustrates the proposed ideas through simulations; the second adds real multi-robot experiments, showcasing the flexibility and robust coordination that our techniques provide.
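
As a rough, hypothetical sketch of the two ingredients described in the abstract (auctioning POMDP policies based on each robot's expected value, and decentralized fusion of local beliefs), the Python example below allocates tasks through a single-item auction in which each robot bids the value of a precomputed policy (represented by alpha-vectors) at its current belief, and fuses discrete beliefs by a normalized product. The names `Robot`, `auction_tasks`, and `fuse_beliefs`, the sequential one-task-per-robot auction, and the independence assumption in the fusion step are illustrative assumptions, not the paper's actual algorithms.

```python
"""Illustrative sketch (not the authors' implementation) of:
(1) auctioning tasks among robots using POMDP value functions as bids, and
(2) a naive decentralized fusion of discrete beliefs.
All class/function names and the toy numbers are hypothetical."""

import numpy as np


class Robot:
    """A robot with a local belief over a discrete state space and a set of
    per-task value functions (alpha-vectors) from precomputed POMDP policies."""

    def __init__(self, name, belief, task_alpha_vectors):
        self.name = name
        self.belief = np.asarray(belief, dtype=float)
        # task_alpha_vectors: {task_id: list of alpha-vectors (np.ndarray)}
        self.task_alpha_vectors = task_alpha_vectors

    def bid(self, task_id):
        """Expected value of executing the policy for `task_id` from the current
        belief: max over alpha-vectors of alpha . b (standard POMDP value)."""
        alphas = self.task_alpha_vectors[task_id]
        return max(float(np.dot(a, self.belief)) for a in alphas)


def auction_tasks(robots, tasks):
    """Sequential single-item auction: each task goes to the highest-bidding
    robot that has not yet won a task."""
    assignment = {}
    free_robots = {r.name for r in robots}
    for task in tasks:
        bids = [(r.bid(task), r.name) for r in robots if r.name in free_robots]
        if not bids:
            break  # no free robots left
        best_value, winner = max(bids)
        assignment[task] = (winner, best_value)
        free_robots.discard(winner)
    return assignment


def fuse_beliefs(beliefs):
    """Fuse discrete beliefs by a normalized product, assuming conditionally
    independent information (a strong simplifying assumption)."""
    fused = np.ones_like(beliefs[0])
    for b in beliefs:
        fused *= b
    return fused / fused.sum()


if __name__ == "__main__":
    # Two robots, three states, two tasks, toy alpha-vectors.
    alphas_a = {"track": [np.array([5.0, 1.0, 0.0])],
                "monitor": [np.array([1.0, 1.0, 1.0])]}
    alphas_b = {"track": [np.array([0.0, 1.0, 5.0])],
                "monitor": [np.array([2.0, 2.0, 2.0])]}
    r1 = Robot("uav1", [0.7, 0.2, 0.1], alphas_a)
    r2 = Robot("uav2", [0.1, 0.2, 0.7], alphas_b)

    print(auction_tasks([r1, r2], ["track", "monitor"]))  # who runs which policy
    print(fuse_beliefs([r1.belief, r2.belief]))           # shared joint belief
```

In the paper's setting, the bids would come from POMDP value functions computed offline for each task, and the decentralized data fusion layer would keep the robots' beliefs consistent under communication assumptions closer to real inter-robot links than those of standard multi-agent POMDP models; the sketch above only mimics that structure at a toy scale.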