A SOLID case for active Bayesian perception in robot touch

  • Authors:
  • Nathan F. Lepora; Uriel Martinez-Hernandez; Tony J. Prescott

  • Affiliations:
  • Sheffield Center for Robotics (SCentRo), University of Sheffield, UK (all authors)

  • Venue:
  • Living Machines '13: Proceedings of the Second International Conference on Biomimetic and Biohybrid Systems
  • Year:
  • 2013


Abstract

In a series of papers, we have formalized a Bayesian perception approach for robotics based on recent progress in understanding animal perception. The main principle is to accumulate evidence for multiple perceptual alternatives until reaching a preset belief threshold, formally related to sequential analysis methods for optimal decision making. Here, we extend this approach to active perception by moving the sensor with a control strategy that depends on the posterior beliefs during decision making. This method can be used to solve problems involving Simultaneous Object Localization and IDentification (SOLID), or 'where and what'. Considering an example in robot touch, we find that active perception gives an efficient, accurate solution to the SOLID problem under uncertain object locations, whereas passive Bayesian perception, which lacks sensorimotor feedback, performs poorly. Thus, active perception can enable robust sensing in unstructured environments.
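The evidence-accumulation principle described in the abstract can be illustrated with a minimal sketch: update posterior beliefs over candidate perceptual classes after each observation, and stop as soon as one belief crosses a preset threshold. This is not the authors' implementation; the Gaussian observation model, the two candidate classes, and the 0.99 threshold are illustrative assumptions, and the active component (moving the sensor as a function of the posterior) is omitted for brevity.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Likelihood of observation x under a Gaussian class model (assumed here)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def accumulate(observations, means, sigma=1.0, threshold=0.99):
    """Sequential Bayesian update over candidate classes.

    Starts from a uniform prior and stops as soon as the maximum
    posterior belief reaches the preset threshold (the stopping rule
    related to sequential analysis mentioned in the abstract).
    Returns (decided class index, number of observations used, posterior).
    """
    n = len(means)
    posterior = [1.0 / n] * n  # uniform prior over perceptual alternatives
    for t, z in enumerate(observations, start=1):
        unnorm = [p * gaussian_pdf(z, mu, sigma) for p, mu in zip(posterior, means)]
        total = sum(unnorm)
        posterior = [u / total for u in unnorm]
        if max(posterior) >= threshold:  # belief threshold reached: decide
            return posterior.index(max(posterior)), t, posterior
    # Evidence exhausted without crossing threshold: return current best guess
    return posterior.index(max(posterior)), len(observations), posterior

# Fixed observations drawn near mean 1.0, so class 1 should win.
obs = [1.1, 0.9, 1.2, 0.8, 1.0, 1.1, 0.9, 1.0, 1.2, 0.8, 1.1, 0.9]
decision, steps, post = accumulate(obs, means=[0.0, 1.0])
print(decision, steps)  # decides class 1 before using all 12 observations
```

In the active-perception extension the abstract describes, the posterior would additionally drive a control strategy that repositions the sensor between observations (e.g. toward the most informative location), which is what couples the 'where' and 'what' estimates in the SOLID problem.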