This paper describes the hardware and algorithms for a real-time social touch gesture recognition system. Early experiments involve a Sensate Bear test rig with full-body touch sensing, sensor visualization, and gesture recognition capabilities. The algorithms are grounded in observations of real humans interacting with a plush bear. In developing a preliminary gesture library of thirteen Symbolic Gestures and eight Touch Subtypes, we have taken the first steps toward a Robotic Touch API, showing that the Huggable robot's behavior system will be able to stream the currently active sensors to detect regional social gestures and local sub-gestures in real time. The system demonstrates the infrastructure to detect three types of touch: social touch, local touch, and sensor-level touch.
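The three detection layers named in the abstract could be sketched as a small pipeline: raw sensor activations are thresholded (sensor-level touch), grouped into body regions (local touch), and then matched against gesture rules (social touch). The sketch below illustrates that layering only; every sensor name, region mapping, threshold, and gesture rule here is an illustrative assumption, not the actual Huggable/Sensate Bear API.

```python
# Illustrative sketch of a three-layer touch pipeline (sensor-level ->
# local -> social). All names, thresholds, and rules are hypothetical.

PRESSURE_THRESHOLD = 0.3  # assumed normalized activation cutoff

# Assumed mapping from individual sensors to body regions.
SENSOR_REGIONS = {
    "s01": "head", "s02": "head",
    "s10": "belly", "s11": "belly", "s12": "belly",
    "s20": "left_paw",
}

def sensor_level_touch(frame):
    """Layer 1: which sensors are currently active in this frame."""
    return {s for s, v in frame.items() if v >= PRESSURE_THRESHOLD}

def local_touch(active):
    """Layer 2: group active sensors by body region (Touch Subtypes
    would further classify the per-region pressure pattern)."""
    regions = {}
    for s in active:
        regions.setdefault(SENSOR_REGIONS.get(s, "unknown"), set()).add(s)
    return regions

def social_gesture(regions):
    """Layer 3: toy rules mapping regional activity to a Symbolic
    Gesture label (a real system would use trained classifiers)."""
    if len(regions.get("head", ())) >= 2:
        return "pat_head"
    if len(regions.get("belly", ())) >= 3:
        return "tickle_belly"
    return None

if __name__ == "__main__":
    frame = {"s01": 0.8, "s02": 0.5, "s10": 0.1}
    active = sensor_level_touch(frame)       # {'s01', 's02'}
    print(social_gesture(local_touch(active)))  # pat_head
```

Streaming recognition would simply call this pipeline on each incoming sensor frame, which matches the abstract's claim that active sensors can be streamed and classified in real time.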