Interactive object recognition using proprioceptive and auditory feedback

  • Authors:
  • Jivko Sinapov; Taylor Bergquist; Connor Schenck; Ugonna Ohiri; Shane Griffith; Alexander Stoytchev

  • Affiliations:
  • Developmental Robotics Laboratory, Iowa State University, USA (all authors)

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2011

Abstract

In this paper we propose a method for interactive recognition of household objects using proprioceptive and auditory feedback. In our experiments, the robot observed the changes in its proprioceptive and auditory sensory streams while performing five exploratory behaviors (lift, shake, drop, crush, and push) on 50 common household objects (e.g. bottles, cups, balls, and toys). The robot was tasked with recognizing the objects it was manipulating by feeling them and listening to the sounds that they make, without using any visual information. The results show that both proprioception and audio, coupled with exploratory behaviors, can be used successfully for object recognition. Furthermore, the robot was able to integrate feedback from the two modalities to achieve even better recognition accuracy. Finally, the results show that the robot can boost its recognition rate even further by applying multiple exploratory behaviors to the same object.
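The abstract describes fusing evidence from two modalities (and from multiple behaviors) to improve recognition. One simple way to do this, sketched below, is to average the class-probability estimates produced by per-modality classifiers and pick the most likely class; this is an illustrative assumption, not necessarily the exact combination scheme used in the paper, and the object names and probabilities here are made up for the example.

```python
# Hedged sketch: fusing class-probability estimates from several
# classifiers (e.g. one per modality or per exploratory behavior)
# by uniform averaging. The paper's actual scheme may differ.

def combine_estimates(estimates):
    """Sum a list of class-probability dicts and renormalize,
    yielding a single combined distribution over object classes."""
    classes = set().union(*estimates)
    combined = {c: sum(e.get(c, 0.0) for e in estimates) for c in classes}
    total = sum(combined.values())
    return {c: p / total for c, p in combined.items()}

def recognize(estimates):
    """Return the most likely object class after combining estimates."""
    combined = combine_estimates(estimates)
    return max(combined, key=combined.get)

# Hypothetical example: audio from a 'shake' weakly favors 'bottle',
# while proprioception from a 'lift' favors it more strongly.
audio_estimate = {"bottle": 0.5, "cup": 0.3, "ball": 0.2}
proprio_estimate = {"bottle": 0.7, "cup": 0.2, "ball": 0.1}
print(recognize([audio_estimate, proprio_estimate]))  # prints "bottle"
```

The same `recognize` call extends naturally to more than two estimates, which mirrors the paper's finding that applying several exploratory behaviors boosts accuracy: each additional behavior simply contributes one more probability estimate to the list.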