This paper addresses the problem of detecting interaction groups in an intelligent environment. Understanding human activity requires identifying the human actors and their interpersonal links. An interaction group can be seen as a basic entity within which individuals collaborate to achieve a common goal. In this regard, dynamic changes in interaction group configuration, i.e. the splitting and merging of interaction groups, can be seen as indicators of new activities. Our approach takes the detected speech activity of the individuals forming interaction groups as input. A classical HMM-based approach, learning a separate HMM for each group configuration, did not produce promising results. We therefore propose an approach for detecting interaction group configurations based on the assumption that conversational turn taking is synchronized within groups. The proposed detector is based on a single HMM constructed from these conversational hypotheses. The approach yields good results, confirming our conversational hypotheses.
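To make the single-HMM idea concrete, here is a minimal sketch, not the authors' implementation: hidden states are the possible partitions of three participants into interaction groups, observations are binary per-person speech-activity vectors, and the emission model encodes the conversational hypothesis that turn taking is synchronized within a group (so overlapping speech inside one group is unlikely, while speakers in different groups may overlap freely). All probability values are illustrative placeholders.

```python
# Hypothetical sketch (not the paper's implementation): hidden states are
# partitions of three participants into interaction groups; observations
# are binary speech-activity vectors (1 = speaking). All probabilities
# below are illustrative placeholders.
PARTITIONS = [
    ((0, 1, 2),),           # everyone in one conversation
    ((0, 1), (2,)),
    ((0, 2), (1,)),
    ((1, 2), (0,)),
    ((0,), (1,), (2,)),     # everyone apart
]

def emission_prob(partition, speech):
    """Conversational hypothesis: turn taking is synchronized inside a
    group, so overlapping speech within one group is unlikely."""
    p = 1.0
    for group in partition:
        active = sum(speech[i] for i in group)
        p *= 0.8 if active <= 1 else 0.1   # penalize within-group overlap
    return p

def viterbi(observations, self_loop=0.9):
    """Most likely sequence of group configurations for a speech-activity
    sequence, with sticky transitions (groups rarely split or merge)."""
    n = len(PARTITIONS)
    switch = (1.0 - self_loop) / (n - 1)
    # uniform prior over configurations at the first observation
    scores = [emission_prob(s, observations[0]) / n for s in PARTITIONS]
    backpointers = []
    for obs in observations[1:]:
        step, new_scores = [], []
        for j, part in enumerate(PARTITIONS):
            best = max(range(n),
                       key=lambda i: scores[i] * (self_loop if i == j else switch))
            step.append(best)
            new_scores.append(scores[best]
                              * (self_loop if best == j else switch)
                              * emission_prob(part, obs))
        scores = new_scores
        backpointers.append(step)
    # backtrack from the best final state
    state = max(range(n), key=lambda j: scores[j])
    path = [state]
    for step in reversed(backpointers):
        state = step[state]
        path.append(state)
    return [PARTITIONS[s] for s in reversed(path)]
```

Under this toy model, sustained overlapping speech from two participants pushes the decoder toward configurations that place them in different groups, while cleanly alternating single speakers are decoded as one group, which is the intuition behind the synchronized turn-taking hypothesis.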