Interactive learning of the acoustic properties of household objects

  • Authors:
  • Jivko Sinapov, Mark Wiemer, Alexander Stoytchev

  • Affiliations:
  • Developmental Robotics Laboratory, Iowa State University (all authors)

  • Venue:
  • ICRA '09: Proceedings of the 2009 IEEE International Conference on Robotics and Automation
  • Year:
  • 2009

Abstract

Human beings can perceive object properties such as size, weight, and material type based solely on the sounds that objects make when an action is performed on them. To be successful, the household robots of the near future must likewise be capable of learning and reasoning about the acoustic properties of everyday objects. Such an ability would allow a robot to detect and classify interactions with objects that occur outside its field of view. This paper presents a framework that allows a robot to infer both the object and the type of behavioral interaction performed with it from the sounds generated during the interaction. The framework is evaluated on a 7-d.o.f. Barrett WAM robot that performs grasping, shaking, dropping, pushing, and tapping behaviors on 36 different household objects. The results show that the robot can learn models that recognize objects, as well as the behaviors performed on them, from the sounds generated during the interaction. In addition, the robot can use the learned models to estimate the similarity between two objects in terms of their acoustic properties.
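
As a rough illustration of the kind of learning the abstract describes (and not the paper's actual pipeline), the sketch below classifies an interaction sound into an (object, behavior) pair using averaged log-spectrogram features and a 1-nearest-neighbor model. The feature representation, window sizes, band count, and classifier are all assumptions made for this example only.

```python
# Minimal sketch (illustrative assumptions, not the paper's method): recognize
# which object and behavior produced a sound clip, using averaged
# log-spectrogram features and a 1-nearest-neighbor classifier.
import numpy as np
from scipy.signal import spectrogram


def sound_features(waveform, sample_rate=44100, n_bands=33):
    """Summarize a mono waveform as mean log-energy in n_bands frequency bands."""
    _, _, spec = spectrogram(waveform, fs=sample_rate, nperseg=1024)
    log_spec = np.log(spec + 1e-10)          # avoid log(0)
    mean_over_time = log_spec.mean(axis=1)   # average energy per frequency bin
    bands = np.array_split(mean_over_time, n_bands)
    return np.array([band.mean() for band in bands])


class NearestNeighborSoundModel:
    """1-NN over sound feature vectors; labels are (object, behavior) pairs."""

    def __init__(self):
        self.features, self.labels = [], []

    def add_example(self, waveform, object_label, behavior_label, sample_rate=44100):
        self.features.append(sound_features(waveform, sample_rate))
        self.labels.append((object_label, behavior_label))

    def predict(self, waveform, sample_rate=44100):
        query = sound_features(waveform, sample_rate)
        dists = [np.linalg.norm(query - f) for f in self.features]
        return self.labels[int(np.argmin(dists))]


if __name__ == "__main__":
    # Placeholder synthetic audio; real examples would come from the robot's microphone.
    rng = np.random.default_rng(0)
    model = NearestNeighborSoundModel()
    model.add_example(rng.normal(size=44100), "plastic_cup", "drop")
    model.add_example(0.1 * rng.normal(size=44100), "metal_can", "tap")
    print(model.predict(rng.normal(size=44100)))
```

Under the same assumptions, distances between the stored feature vectors could also serve as a crude acoustic-similarity measure between objects, loosely corresponding to the similarity estimation mentioned at the end of the abstract.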