Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Explaining effects of eye gaze on mediated group conversations: amount or synchronization?
CSCW '02 Proceedings of the 2002 ACM conference on Computer supported cooperative work
Providing the basis for human-robot-interaction: a multi-modal attention system for a mobile robot
Proceedings of the 5th international conference on Multimodal interfaces
Where to look: a study of human-robot engagement
Proceedings of the 9th international conference on Intelligent user interfaces
Learning to Detect Objects in Images via a Sparse, Part-Based Representation
IEEE Transactions on Pattern Analysis and Machine Intelligence
Identifying the addressee in human-human-robot interactions based on head pose and speech
Proceedings of the 6th international conference on Multimodal interfaces
Towards a model of face-to-face grounding
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
Direction of attention perception for conversation initiation in virtual environments
Lecture Notes in Computer Science
A model of attention and interest using gaze behavior
Lecture Notes in Computer Science
Recognizing gaze aversion gestures in embodied conversational discourse
Proceedings of the 8th international conference on Multimodal interfaces
Museum guide robot based on sociological interaction analysis
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
How close?: model of proximity control for information-presenting robots
Proceedings of the 3rd ACM/IEEE international conference on Human robot interaction
Precision timing in human-robot interaction: coordination of head movement and utterance
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Footing in human-robot conversations: how robots might shape participant roles using gaze cues
Proceedings of the 4th ACM/IEEE international conference on Human robot interaction
Revealing Gauguin: engaging visitors in robot guide's explanation in an art museum
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Explorations in engagement for humans and robots
Artificial Intelligence
Detecting user engagement with a robot companion using task and social interaction-based features
Proceedings of the 2009 international conference on Multimodal interfaces
Learning to predict engagement with a spoken dialog system in open-world settings
SIGDIAL '09 Proceedings of the SIGDIAL 2009 Conference: The 10th Annual Meeting of the Special Interest Group on Discourse and Dialogue
Receptionist or information kiosk: how do people talk with a robot?
Proceedings of the 2010 ACM conference on Computer supported cooperative work
Estimating user's engagement from eye-gaze behaviors in human-agent conversations
Proceedings of the 15th international conference on Intelligent user interfaces
Reconfiguring spatial formation arrangement by robot body orientation
Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction
Recognizing engagement in human-robot interaction
Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction
A model of proximity control for information-presenting robots
IEEE Transactions on Robotics
Facilitating multiparty dialog with gaze, gesture, and speech
International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
Conversational gaze mechanisms for humanlike robots
ACM Transactions on Interactive Intelligent Systems (TiiS)
Multiparty turn taking in situated dialog: study, lessons, and directions
SIGDIAL '11 Proceedings of the SIGDIAL 2011 Conference
Robot behavior toolkit: generating effective social behaviors for robots
HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
Designing persuasive robots: how robots might persuade people using vocal and nonverbal cues
HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
Pay attention!: designing adaptive agents that monitor and improve user engagement
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Designing effective gaze mechanisms for virtual agents
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Recognizing users' engagement state and intentions is a pressing task for computational agents seeking to sustain fluid conversations in situated interactions. We investigate how to quantitatively estimate high-level user engagement and intentions from low-level visual cues, and how to design engagement-aware behaviors that allow conversational agents to act in a sociable manner. Drawing on machine learning techniques, we propose two computational models that quantify users' attention saliency and engagement intentions. Their performance is validated by a close match between the predicted values and ground-truth annotation data. We then design a novel engagement-aware behavior model with which the agent adjusts its direction of attention and manages the conversational floor based on the estimated user engagement. In a user study, we evaluated the agent's behaviors in a multiparty dialog scenario. The results show that the agent's engagement-aware behaviors significantly improved the effectiveness of communication and positively affected users' experience.
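The abstract describes mapping low-level visual cues to an engagement estimate that the agent can act on. As a minimal sketch of that idea (the feature names, weights, and threshold below are invented for illustration; the paper learns such mappings from annotated data rather than hand-tuning them):

```python
import math

def attention_saliency(gaze_on_agent_ratio, mutual_gaze_rate, head_yaw_deg):
    """Toy score in [0, 1] for how salient the agent is in the user's
    visual attention. Inputs are hypothetical per-window gaze features;
    the linear weights here are illustrative, not from the paper."""
    z = (3.0 * gaze_on_agent_ratio      # fraction of time gazing at the agent
         + 2.0 * mutual_gaze_rate       # rate of mutual-gaze events
         - 0.02 * abs(head_yaw_deg)     # penalize head turned away
         - 1.5)                         # bias term
    return 1.0 / (1.0 + math.exp(-z))   # squash to [0, 1]

def engagement_intention(saliency_history, threshold=0.5):
    """Classify the user as intending to engage when mean recent
    attention saliency exceeds a (hand-picked) threshold."""
    return sum(saliency_history) / len(saliency_history) > threshold

# A user gazing mostly at the agent scores higher than one looking away.
attentive = attention_saliency(0.9, 0.6, 5)    # close to 1
distracted = attention_saliency(0.1, 0.0, 60)  # close to 0
```

An engagement-aware behavior policy would then consume these estimates, e.g. turning the agent's gaze toward the participant with the highest predicted intention before yielding the conversational floor.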