Multimodal coordination: exploring relevant features and measures
Proceedings of the 2nd international workshop on Social signal processing
Automatic analysis of the degree of human-human coordination raises challenging questions. In this paper, we propose to automatically predict the degree of coordination between dyadic partners performing an imitation task. A subjective evaluation of each dyad's coordination was obtained via a questionnaire addressed to human judges. We extracted features from speech, gesture segmentation, and synchronized movements to predict the coordination status of each dyad. Several features perfectly discriminated the examples of the low-coordination class from those of the high-coordination class.
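The notion of a feature that "perfectly discriminates" two classes can be illustrated with a minimal sketch: a feature separates the low- and high-coordination dyads perfectly if some threshold splits the two groups of values without error. All names and values below are hypothetical, not data from the paper.

```python
# Hypothetical illustration: testing whether a single per-dyad feature
# (e.g. a synchrony score) perfectly separates dyads judged
# low-coordination from those judged high-coordination.

def perfectly_separates(values_low, values_high):
    """True if some threshold splits the two classes without error,
    i.e. the two value ranges do not overlap."""
    return (max(values_low) < min(values_high)
            or max(values_high) < min(values_low))

# Made-up synchrony scores, one per dyad
low_coord  = [0.12, 0.18, 0.22]   # dyads judged low-coordination
high_coord = [0.41, 0.47, 0.55]   # dyads judged high-coordination

print(perfectly_separates(low_coord, high_coord))  # True: ranges do not overlap
```

In practice such a check would be run once per candidate feature, retaining those features whose value ranges for the two judged classes do not overlap.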