In face-to-face conversations, speakers continuously check whether the listener is engaged and adjust their conversational strategy when the listener is not fully engaged. With the goal of building a conversational agent that can adaptively control conversations with the user, this study analyzes the user's gaze behaviors and proposes a method for estimating whether the user is engaged in the conversation based on gaze transition 3-gram patterns. First, we conduct a Wizard-of-Oz experiment to collect the user's gaze behaviors. Based on an analysis of the gaze data, we propose an engagement estimation method that detects the user's disengagement gaze patterns. The algorithm is implemented as a real-time engagement-judgment mechanism and incorporated into the multimodal dialogue manager of a conversational agent. The agent estimates the user's conversational engagement and generates probing questions when the user is distracted from the conversation. Finally, we conduct an evaluation experiment with the proposed engagement-sensitive agent and demonstrate that the engagement estimation function improves the user's impression of the agent and of the interaction with it. In addition, probing performed with proper timing was found to have a positive effect on the user's verbal and nonverbal behaviors in communicating with the conversational agent.
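The core idea of 3-gram-based disengagement detection can be sketched as follows. This is a minimal illustrative example, not the paper's actual method: the gaze-target labels and the set of disengagement patterns here are invented placeholders, whereas the study derives its patterns from Wizard-of-Oz data.

```python
# Hedged sketch of gaze transition 3-gram matching.
# Labels ("agent", "elsewhere", ...) and DISENGAGEMENT_PATTERNS are
# hypothetical; the paper learns its patterns from collected gaze data.

def gaze_3grams(gaze_sequence):
    """Return all consecutive 3-grams of gaze-target labels."""
    return [tuple(gaze_sequence[i:i + 3])
            for i in range(len(gaze_sequence) - 2)]

# Assumed example patterns signaling disengagement (illustrative only).
DISENGAGEMENT_PATTERNS = {
    ("elsewhere", "elsewhere", "elsewhere"),
    ("agent", "elsewhere", "elsewhere"),
}

def is_disengaged(gaze_sequence):
    """Flag the user as disengaged if any observed 3-gram matches."""
    return any(g in DISENGAGEMENT_PATTERNS
               for g in gaze_3grams(gaze_sequence))
```

In a real-time setting, such a check would run over a sliding window of recent gaze targets, and a positive match would trigger the agent's probing question.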