Increasing the expressivity of humanoid robots with variable gestural expressions

  • Authors:
  • Andre Viergutz; Tamara Flemisch; Raimund Dachselt

  • Affiliations:
  • Technische Universität Dresden, Dresden, Germany (all authors)

  • Venue:
  • Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction
  • Year:
  • 2014

Abstract

This work aims to establish more variable, and thus more engaging, communication with humanoid robots. For this purpose, we developed a method of dynamically expressing emotions and intentions through parametrised gestures. First, we analyzed gestures and the parameters necessary for generating a whole-body gesture set. On this basis, a gesture's inner and outer expressivity is defined, which leads to the differentiation between single gestures and variable gestural expressions. Gestures are subsequently classified into (feedback) categories and related to one another according to their expressivity. The resulting gesture set consists of six categories comprising over 50 variable gestures. As a proof of concept, the gesture set has been implemented to allow for an easy and flexible authoring process of gesture-supported communication.
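
The abstract describes parametrised gestures that are grouped into feedback categories and varied through their inner and outer expressivity. As a purely illustrative sketch, not the authors' implementation, the following Python data model shows how such variable gestural expressions might be represented for an authoring tool; all names and parameters (e.g. GestureCategory, amplitude, speed, repetitions) are assumptions introduced here for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class GestureCategory(Enum):
    """Hypothetical feedback categories; the paper defines six but does not name them here."""
    AFFIRMATION = auto()
    NEGATION = auto()
    GREETING = auto()
    POINTING = auto()
    EMOTION = auto()
    IDLE = auto()


@dataclass
class Expressivity:
    """Illustrative expressivity parameters (assumed, not taken from the paper)."""
    amplitude: float = 1.0   # 0.0 = subtle ... 1.0 = exaggerated
    speed: float = 1.0       # playback speed multiplier
    repetitions: int = 1     # how often the stroke phase repeats


@dataclass
class Gesture:
    """A single whole-body gesture that can be varied via its parameters."""
    name: str
    category: GestureCategory
    expressivity: Expressivity = field(default_factory=Expressivity)

    def vary(self, **params):
        """Return a copy with updated expressivity, yielding a variable
        gestural expression of the same base gesture."""
        merged = {**self.expressivity.__dict__, **params}
        return Gesture(self.name, self.category, Expressivity(**merged))


# Usage: an authoring process could pick a gesture by category and adjust
# its expressivity to match the intended emotion or intention.
nod = Gesture("nod", GestureCategory.AFFIRMATION)
enthusiastic_nod = nod.vary(amplitude=1.0, speed=1.4, repetitions=3)
```

In this sketch, the distinction between a single gesture and a variable gestural expression is captured by keeping one base gesture per name and deriving variants through the expressivity parameters, which is one plausible way to support the flexible authoring process mentioned in the abstract.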