Conversing with the user based on eye-gaze patterns
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
In face-to-face conversations, speakers continuously check whether the listener is engaged, and when the listener is not fully engaged, the speaker adjusts the conversational content or strategy. With the goal of building a conversational agent that can control conversations with the user in such an adaptive way, this study analyzes the user's gaze behaviors and proposes a method for predicting whether the user is engaged in the conversation based on gaze transition 3-gram patterns. First, we conducted a Wizard-of-Oz experiment to collect the user's gaze behaviors, together with the user's subjective reports and an observer's judgments of the user's interest in the conversation. Next, we proposed an engagement estimation algorithm that estimates the user's degree of engagement from gaze transition patterns, taking into account individual differences in gaze behavior. The algorithm is implemented as a real-time engagement-judgment mechanism, and the results of our evaluation experiment showed that our method can predict the user's conversational engagement quite well.
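The abstract does not spell out how 3-gram patterns are turned into an engagement estimate, but the general idea of scoring a gaze-state sequence by its transition 3-grams can be sketched as follows. This is an illustrative sketch only, not the authors' algorithm: the gaze-state labels ('P' for partner's face, 'O' for shared object, 'E' for elsewhere) and the log-odds scoring against per-user frequency tables are assumptions introduced here for clarity.

```python
import math
from collections import Counter

def gaze_3grams(states):
    """Slide a window of length 3 over a gaze-state sequence
    and count how often each 3-gram pattern occurs."""
    return Counter(tuple(states[i:i + 3]) for i in range(len(states) - 2))

def engagement_score(states, engaged_counts, disengaged_counts):
    """Naive log-odds score: positive means the recent gaze pattern
    looks more like the 'engaged' model than the 'disengaged' one.

    The two models are 3-gram count tables built from labeled data,
    ideally per user, so that individual gaze habits are accounted for.
    Unseen 3-grams are smoothed with a pseudo-count of 1.
    """
    score = 0.0
    for gram, n in gaze_3grams(states).items():
        p_engaged = engaged_counts.get(gram, 0) + 1
        p_disengaged = disengaged_counts.get(gram, 0) + 1
        score += n * math.log(p_engaged / p_disengaged)
    return score

# Example: a user who keeps returning gaze to the partner scores
# positively against an 'engaged' model dominated by such patterns.
engaged = {('P', 'P', 'O'): 5, ('P', 'O', 'P'): 4}
disengaged = {('E', 'E', 'E'): 6}
print(engagement_score(['P', 'P', 'O', 'P'], engaged, disengaged))
```

In a real-time setting, such a score would be computed over a sliding window of recent gaze states and thresholded to trigger a change in the agent's conversational strategy.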