Sonification for supporting joint attention in dyadic augmented reality-based cooperation
Proceedings of the 8th Audio Mostly Conference
In this paper we introduce two new methods for real-time sonification of head movements and head gestures. Head gestures such as nodding or shaking the head are important non-verbal back-channelling signals that facilitate the coordination and alignment of communicating interaction partners. Visually impaired persons cannot interpret such non-verbal signals; the same holds for people in mediated communication (e.g. on the phone) and for cooperating users whose visual attention is focused elsewhere. We present our approach to these issues, our sensing setup, and two different sonification methods. A first preliminary study on the recognition of the signals shows that subjects understand the gesture type even without prior explanation and can estimate gesture intensity and frequency with little or no training.
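The abstract does not detail the two sonification methods. As an illustration of the general idea only, a minimal parameter-mapping sketch is shown below: yaw velocity (head shaking) is mapped to pitch deviation and pitch-axis velocity (nodding) to loudness. All function names, the input format, and the mapping constants are assumptions for illustration, not the authors' design.

```python
def sonify_head_motion(yaw_deg, pitch_deg, dt=0.02,
                       base_freq=220.0, freq_scale=2.0, max_amp=1.0):
    """Map a stream of head-orientation samples to per-frame
    (frequency, amplitude) pairs -- a simple parameter-mapping
    sonification (illustrative sketch, not the paper's method).

    yaw_deg / pitch_deg: equal-length lists of head yaw and pitch
    angles in degrees, sampled every dt seconds (assumed format).
    """
    frames = []
    for i in range(1, len(yaw_deg)):
        # finite-difference angular velocities in degrees per second
        yaw_vel = (yaw_deg[i] - yaw_deg[i - 1]) / dt
        pitch_vel = (pitch_deg[i] - pitch_deg[i - 1]) / dt
        # shaking (yaw) shifts the tone's pitch around a base frequency
        freq = base_freq + freq_scale * yaw_vel
        # nodding (pitch axis) drives loudness, clamped to max_amp
        amp = min(max_amp, abs(pitch_vel) / 90.0)
        frames.append((freq, amp))
    return frames
```

A synthesizer would then render each frame as a short tone segment, so listeners hear rising and falling pitch for a shake and pulsing loudness for a nod.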