Head gesture sonification for supporting social interaction

  • Authors:
  • Thomas Hermann; Alexander Neumann; Sebastian Zehe

  • Affiliations:
  • Bielefeld University, Bielefeld, Germany (all authors)

  • Venue:
  • Proceedings of the 7th Audio Mostly Conference: A Conference on Interaction with Sound
  • Year:
  • 2012

Abstract

In this paper we introduce two new methods for the real-time sonification of head movements and head gestures. Head gestures such as nodding or shaking the head are important non-verbal back-channelling signals that facilitate coordination and alignment between communicating interaction partners. Visually impaired persons cannot perceive such non-verbal signals, and neither can people in mediated communication (e.g. on the phone) or cooperating users whose visual attention is focused elsewhere. We introduce our approach to these issues, our sensing setup, and two different sonification methods. A first preliminary study on the recognition of the sonified signals shows that subjects understand the gesture type even without prior explanation and can estimate gesture intensity and frequency with little or no training.
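
To illustrate the general idea of turning head-movement data into sound, the sketch below shows a generic parameter-mapping sonification in Python. It is only a minimal, hypothetical example assuming gyroscope-style rotation-rate input at 100 Hz; the sensor rates, the pitch/loudness mapping, and all function and file names are assumptions for illustration and do not reproduce the authors' two sonification methods.

```python
# Illustrative sketch only: a generic parameter-mapping sonification of head
# rotation rates, NOT the method from the paper. Sensor rate, mapping and
# names are assumptions for this demo.
import numpy as np
import wave

AUDIO_SR = 44_100          # audio sample rate (Hz)
SENSOR_SR = 100            # assumed gyroscope / head-tracker rate (Hz)

def sonify(yaw_rate, pitch_rate, base_freq=440.0):
    """Map head rotation rates (rad/s) to a continuous audio signal.

    Shaking (yaw axis) raises the pitch, nodding (pitch axis) lowers it;
    overall movement intensity controls loudness.
    """
    # Upsample the sensor stream to audio rate by simple sample repetition.
    reps = AUDIO_SR // SENSOR_SR
    yaw = np.repeat(yaw_rate, reps)
    pit = np.repeat(pitch_rate, reps)

    intensity = np.abs(yaw) + np.abs(pit)            # movement energy
    amp = np.clip(intensity / 3.0, 0.0, 1.0)         # loudness from intensity
    freq = base_freq * 2.0 ** (0.5 * (yaw - pit))    # pitch from gesture axis

    phase = 2 * np.pi * np.cumsum(freq) / AUDIO_SR   # accumulate phase (FM)
    return (amp * np.sin(phase)).astype(np.float32)

if __name__ == "__main__":
    t = np.linspace(0, 2, 2 * SENSOR_SR)             # 2 s of synthetic data
    nod = 1.5 * np.sin(2 * np.pi * 2.0 * t)          # ~2 Hz nodding gesture
    audio = sonify(yaw_rate=np.zeros_like(nod), pitch_rate=nod)

    with wave.open("nod_sonification.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(AUDIO_SR)
        f.writeframes((audio * 32767).astype(np.int16).tobytes())
```

In this toy mapping a 2 Hz nodding gesture is heard as a tone whose loudness and pitch pulse twice per second, which conveys both the gesture type (via pitch direction) and its frequency and intensity (via modulation rate and depth), in the spirit of the perceptual cues the study evaluates.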