This paper addresses the novel problem of characterizing conversational group dynamics. It is well documented in social psychology that a group's dynamics differ depending on its objectives: a competitive meeting, for example, has a different objective from a collaborative one. We propose a method that characterizes group dynamics through a joint description of the group members' aggregated acoustic nonverbal behaviour, and use it to classify two meeting datasets, one cooperative and one competitive. On 4.5 hours of real multi-party behavioural data, our method achieves a classification rate of up to 100%.
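The aggregate-then-classify pipeline described in the abstract can be sketched as follows. Note that this is a minimal illustration, not the paper's actual method: the per-member cues (`speaking_fraction`, `interruptions`), the descriptor statistics, and the nearest-centroid classifier are all hypothetical stand-ins for whatever acoustic nonverbal features and classifier the paper uses.

```python
from statistics import mean, pstdev

def group_descriptor(member_features):
    """Aggregate per-member nonverbal cues into one group-level vector.

    Hypothetical cues: each member contributes a fraction of total
    speaking time and a count of interruptions; the group descriptor
    is the mean and spread of each cue across members.
    """
    speak = [f["speaking_fraction"] for f in member_features]
    inter = [f["interruptions"] for f in member_features]
    return [mean(speak), pstdev(speak), mean(inter), pstdev(inter)]

def nearest_centroid_classify(train_descriptors, labels, query):
    """Toy stand-in classifier: assign the query meeting to the class
    whose mean descriptor is closest in Euclidean distance."""
    centroids = {}
    for lab in set(labels):
        vecs = [v for v, l in zip(train_descriptors, labels) if l == lab]
        centroids[lab] = [mean(col) for col in zip(*vecs)]

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return min(centroids, key=lambda lab: dist(centroids[lab], query))

# Illustrative data: a cooperative meeting with evenly shared floor
# time versus a competitive one dominated by a single speaker.
coop = group_descriptor(
    [{"speaking_fraction": 0.25, "interruptions": 1}] * 4)
comp = group_descriptor(
    [{"speaking_fraction": f, "interruptions": n}
     for f, n in [(0.7, 5), (0.1, 0), (0.1, 0), (0.1, 1)]])
```

The design intuition is that group-level statistics (how evenly floor time is shared, how bursty interruptions are) capture the cooperative-versus-competitive distinction without needing any per-individual model.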