Human beings can perceive object properties such as size, weight, and material type based solely on the sounds that objects make when an action is performed on them. To be successful, the household robots of the near future must likewise be able to learn and reason about the acoustic properties of everyday objects. Such an ability would allow a robot to detect and classify interactions with objects that occur outside its field of view. This paper presents a framework that allows a robot to infer both the object and the type of behavioral interaction performed with it from the sounds generated during the interaction. The framework is evaluated on a 7-d.o.f. Barrett WAM robot that performs grasping, shaking, dropping, pushing, and tapping behaviors on 36 different household objects. The results show that the robot can learn models that recognize objects (and the behaviors performed on them) from the sounds generated during the interaction. In addition, the robot can use the learned models to estimate the similarity between two objects in terms of their acoustic properties.
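The abstract does not spell out the recognition pipeline, but the general idea of classifying objects from interaction sounds can be sketched in a few lines. The sketch below is an illustration only, not the paper's method: it assumes normalized spectral-band energies as the acoustic feature and a 1-nearest-neighbor classifier over labeled example recordings (the function names, band count, and object labels are hypothetical).

```python
import math

def band_energies(samples, n_bands=8):
    """Crude acoustic feature: normalized energy in n_bands frequency bands.

    Uses a naive DFT over the first half of the spectrum; real systems would
    use an FFT and richer features (e.g., spectrogram or cepstral features).
    """
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    band = max(1, len(mags) // n_bands)
    feats = [sum(mags[i:i + band]) for i in range(0, band * n_bands, band)]
    total = sum(feats) or 1.0
    # Normalize so overall loudness does not dominate the comparison.
    return [f / total for f in feats]

def nearest_label(feature, examples):
    """1-NN classification: examples is a list of (feature_vector, label)."""
    return min(examples, key=lambda e: math.dist(feature, e[0]))[1]
```

The same feature vectors also give a natural acoustic-similarity measure between two objects (e.g., the Euclidean distance between their average band-energy vectors), in the spirit of the similarity estimates described in the abstract.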