A real-time head nod and shake detector. Proceedings of the 2001 Workshop on Perceptive User Interfaces.
Contextual recognition of head gestures. ICMI '05: Proceedings of the 7th International Conference on Multimodal Interfaces.
Recognition of Multi-Pose Head Gestures in Human Conversations. ICIG '07: Proceedings of the Fourth International Conference on Image and Graphics.
ICMI '08: Proceedings of the 10th International Conference on Multimodal Interfaces.
Learning large margin likelihoods for realtime head pose tracking. ICIP '09: Proceedings of the 16th IEEE International Conference on Image Processing.
Head nods occur in virtually every face-to-face discussion. As part of the backchannel domain, they are used not only to express a 'yes' but also to display interest or enhance communicative attention. Detecting head nods in natural interactions is a challenging task, as head nods can be subtle in both amplitude and duration. In this study, we make use of findings in psychology establishing that the dynamics of head gestures are conditioned on the person's speaking status. We develop a multimodal method using audio-based self-context to detect head nods in natural settings. We demonstrate that our multimodal approach, which exploits the speaking status of the person under analysis, significantly improves the detection rate over a visual-only approach.
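The idea of audio-based self-context can be illustrated with a minimal sketch: a visual-only nod score is combined with the person's speaking status, using a context-dependent decision threshold. All function names, thresholds, and values below are illustrative assumptions for exposition, not the paper's actual model.

```python
# Hypothetical sketch of speaking-status self-context for nod detection.
# Thresholds and data are illustrative assumptions, not the paper's values.

def detect_nods(visual_scores, speaking, thr_speaking=0.8, thr_listening=0.6):
    """Per-frame nod decisions conditioned on speaking status.

    visual_scores: per-frame nod likelihoods in [0, 1] from a
        visual-only detector.
    speaking: per-frame booleans, True while the person speaks.
    A lower threshold is applied while the person is listening,
    where backchannel nods tend to be subtler.
    """
    decisions = []
    for score, is_speaking in zip(visual_scores, speaking):
        thr = thr_speaking if is_speaking else thr_listening
        decisions.append(score >= thr)
    return decisions

frames = [0.65, 0.70, 0.85, 0.40]    # visual nod scores per frame
speaking = [False, False, True, True]  # speaking status per frame
print(detect_nods(frames, speaking))   # → [True, True, True, False]
```

The design point is that the same visual evidence (e.g., a score of 0.65) is accepted as a nod while listening but rejected while speaking, which is one simple way the speaking-status context could modulate a visual detector.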