Understanding the ability to coordinate with a partner is a major challenge in social signal processing and social robotics. In this paper, we designed a child-adult imitation task to investigate how automatically computable cues on turn-taking and movement can give insight into the high-level perception of coordination. We first collected human questionnaire ratings to evaluate the perceived coordination of each dyad. We then extracted automatically computable cues and dialog-act information from the video clips; the automatic cues characterized speech and gestural turn-taking and the coordinated movements of the dyad. Finally, we compared the human scores with the automatic cues to determine which cues are informative about the perception of coordination during the task. We found that the adult adjusted his or her behavior to the child's needs, and that a disruption of the gestural turn-taking rhythm was perceived negatively by the judges. Judges also rated negatively the dyads that talked more, since speech occurred mainly when the child had difficulty imitating. Finally, coherence measures between the partners' movement features appeared better suited than correlation to characterize their coordination.
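The advantage of coherence over correlation noted above can be illustrated with a toy example (not from the paper's data): two movement signals that share the same rhythm but are phase-shifted, as in alternating imitation gestures, show near-zero Pearson correlation yet high spectral coherence at the shared frequency. The sketch below assumes NumPy and a simple Welch-style segment-averaged coherence estimate; the signals, sampling rate, and function are illustrative only.

```python
import numpy as np

def coherence(x, y, fs, nperseg):
    """Magnitude-squared coherence via Welch-style segment averaging."""
    win = np.hanning(nperseg)
    nseg = len(x) // nperseg
    Pxx = Pyy = 0.0
    Pxy = 0.0 + 0.0j
    for k in range(nseg):
        sl = slice(k * nperseg, (k + 1) * nperseg)
        X = np.fft.rfft(x[sl] * win)
        Y = np.fft.rfft(y[sl] * win)
        Pxx = Pxx + np.abs(X) ** 2   # averaged auto-spectrum of x
        Pyy = Pyy + np.abs(Y) ** 2   # averaged auto-spectrum of y
        Pxy = Pxy + X * np.conj(Y)   # averaged cross-spectrum
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.abs(Pxy) ** 2 / (Pxx * Pyy)

# Hypothetical movement signals: two partners oscillating at the same
# 1 Hz rhythm but 90 degrees out of phase, sampled at 25 fps (video rate).
fs, f0 = 25.0, 1.0
t = np.arange(500) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * f0 * t)
y = np.sin(2 * np.pi * f0 * t - np.pi / 2) + 0.1 * rng.standard_normal(t.size)

corr = np.corrcoef(x, y)[0, 1]        # near 0: the phase lag hides the link
freqs, coh = coherence(x, y, fs, nperseg=50)
coh_at_f0 = coh[np.argmin(np.abs(freqs - f0))]   # near 1: shared rhythm
```

Because coherence is computed per frequency band from averaged cross-spectra, it rewards a stable phase relation at a common tempo rather than instant-by-instant similarity, which is closer to what "coordinated movement" means in a turn-taking task.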