This paper presents a tactile language for controlling a robot through its artificial skin. The language greatly improves multimodal human-robot communication by adding both redundant and inherently new ways of controlling a robot through the tactile mode. We defined an interface for arbitrary tactile sensors, implemented symbol recognition for multi-finger contacts, and integrated these, together with freely available character-recognition software, into an easy-to-extend system for tactile-language processing that can also incorporate and process data from non-tactile interfaces. The recognized tactile symbols allow both direct control of the robot's tool center point and abstract commands such as "stop" or "grasp object x with grasp type y". Beyond this versatility, the symbols are also highly expressive, since multiple parameters such as direction, distance, and speed can be decoded from a single finger stroke. Furthermore, our efficient symbol-recognition implementation achieves real-time performance while remaining platform-independent. We have successfully used both a multi-touch finger pad and our artificial robot skin as tactile interfaces. We evaluated the tactile-language system by measuring its symbol- and angle-recognition performance, and the results are promising.
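As a rough illustration of how several parameters can be decoded from a single finger stroke, the sketch below extracts direction, distance, and speed from a sequence of touch samples. This is a minimal assumption-based example, not the paper's implementation: the function name `decode_stroke` and the `(x, y, t)` sample format are hypothetical.

```python
import math

def decode_stroke(points):
    """Decode direction, distance, and speed from one finger stroke.

    `points` is a list of (x, y, t) samples from a hypothetical tactile
    interface; the sample format and units are illustrative only.
    """
    (x0, y0, t0) = points[0]
    (x1, y1, t1) = points[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)                 # straight-line stroke length
    direction = math.degrees(math.atan2(dy, dx))  # stroke angle in degrees
    duration = t1 - t0
    speed = distance / duration if duration > 0 else 0.0
    return direction, distance, speed

# Example: a stroke moving 3 units right and 4 units up over 0.5 s
direction, distance, speed = decode_stroke([(0, 0, 0.0), (3, 4, 0.5)])
# direction ≈ 53.13°, distance = 5.0, speed = 10.0
```

A real recognizer would also classify the stroke's shape (the symbol itself), but even this simple endpoint computation shows how one continuous contact can carry several command parameters at once.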