Over the last decade, the surprising fact has emerged that machines can possess therapeutic power. Due to the many healing qualities of touch, one route to such power is through haptic emotional interaction, which requires sophisticated touch sensing and interpretation. We explore the development of touch recognition technologies in the context of a furry artificial lap-pet, with the ultimate goal of creating therapeutic interactions by sensing human emotion through touch. In this work, we build upon a previous design for a new type of fur-based touch sensor. Here, we integrate our fur sensor with a piezoresistive fabric location/pressure sensor, and adapt the combined design to cover a curved creature-like object. We then use this interface to collect synchronized time-series data from the two sensors, and perform machine learning analysis to recognize 9 key affective touch gestures. In a study of 16 participants, our model averages 94% recognition accuracy when trained on individuals, and 86% when applied to the combined set of all participants. The model can also recognize which participant is touching the prototype with 79% accuracy. These results promise a new generation of emotionally intelligent machines, enabled by affective touch gesture recognition.
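The pipeline the abstract describes (windowed time-series from two touch sensors, classified into nine gestures) can be sketched roughly as below. This is an illustrative reconstruction, not the paper's actual code: the gesture labels, the four-channel synthetic data, and the choice of a random-forest classifier with per-channel summary statistics are all assumptions for demonstration.

```python
# Hypothetical sketch of affective touch-gesture classification from two
# synchronized sensor streams (fur sensor + fabric pressure/location sensor).
# Labels, channel layout, features, and model choice are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Nine placeholder gesture labels (assumed, not the paper's exact set).
GESTURES = ["stroke", "scratch", "pat", "rub", "tickle",
            "poke", "squeeze", "constant", "no-touch"]

def window_features(window):
    """Summary statistics per channel for one time window (T x channels)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

rng = np.random.default_rng(0)
# Synthetic stand-in data: 900 windows, 50 samples long, 4 channels
# (e.g., fur-sensor channels plus pressure/location channels).
labels = rng.integers(0, len(GESTURES), size=900)
windows = rng.normal(size=(900, 50, 4)) + labels[:, None, None] * 0.3

X = np.array([window_features(w) for w in windows])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Training one such model per participant versus on the pooled data corresponds to the two evaluation conditions reported in the abstract (per-individual vs. combined accuracy).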