A tactile language for intuitive human-robot communication

  • Authors:
  • Andreas J. Schmid, Martin Hoffmann, Heinz Woern

  • Affiliations:
  • University of Karlsruhe, Karlsruhe, Germany (all authors)

  • Venue:
  • Proceedings of the 9th international conference on Multimodal interfaces
  • Year:
  • 2007


Abstract

This paper presents a tactile language for controlling a robot through its artificial skin. This language greatly improves multimodal human-robot communication by adding both redundant and inherently new ways of controlling the robot through the tactile mode. We defined an interface for arbitrary tactile sensors, implemented symbol recognition for multi-finger contacts, and integrated both with freely available character-recognition software into an easy-to-extend system for tactile language processing that can also incorporate and process data from non-tactile interfaces. The recognized tactile symbols allow both direct control of the robot's tool center point and abstract commands like "stop" or "grasp object x with grasp type y". In addition to this versatility, the symbols are highly expressive, since multiple parameters such as direction, distance, and speed can be decoded from a single finger stroke. Furthermore, our efficient symbol recognition implementation achieves real-time performance while remaining platform-independent. We have successfully used both a multi-touch finger pad and our artificial robot skin as tactile interfaces. We evaluated our tactile language system by measuring its symbol and angle recognition performance, and the results are promising.
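To make the claim about stroke expressiveness concrete, the sketch below shows one plausible way to decode direction, distance, and speed from a single timestamped finger trajectory. This is an illustrative reconstruction under assumed input conventions (a list of `(t, x, y)` samples in seconds and sensor units), not the authors' actual recognizer.

```python
import math

def decode_stroke(samples):
    """Decode (direction, distance, speed) from one finger stroke.

    samples: list of (t, x, y) touch samples ordered by time.
    Hypothetical helper for illustration; the paper's recognizer
    may use a different representation or filtering.
    """
    (t0, x0, y0) = samples[0]
    (t1, x1, y1) = samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)                     # straight-line length
    angle_deg = math.degrees(math.atan2(dy, dx)) % 360.0  # direction in [0, 360)
    duration = t1 - t0
    speed = distance / duration if duration > 0 else 0.0
    return angle_deg, distance, speed

# Example: a 0.5 s stroke from (0, 0) to (3, 4)
angle, dist, speed = decode_stroke([(0.0, 0, 0), (0.5, 3, 4)])
```

For the example stroke, the decoded distance is 5 units and the speed is 10 units per second, showing how one contact gesture yields several control parameters at once.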