With the goal of building an intelligent conversational agent that can recognize user engagement, this paper proposes a method for judging a user's conversational engagement from head pose data. First, we analyzed how head pose information correlates with conversational engagement and found that the amplitudes of head movement and rotation have a moderate positive correlation with the level of engagement. We then built an engagement estimation model by applying a decision tree learning algorithm to 19 head-pose parameters. The results show that the proposed model, based on head pose information alone, performs quite well.
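The pipeline described above (extract amplitude statistics from head-pose frames, then classify engagement with a decision tree) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature names, thresholds, and the hand-coded tree are assumptions standing in for the learned model over the actual 19 parameters.

```python
# Illustrative sketch of engagement estimation from head pose.
# Features and thresholds are hypothetical, not the paper's values.

def extract_features(head_poses):
    """head_poses: list of (pitch, yaw, roll) angles in degrees, one per frame.
    Returns amplitude-style statistics (a small subset of the 19 parameters)."""
    pitches = [p for p, _, _ in head_poses]
    yaws = [y for _, y, _ in head_poses]
    return {
        "pitch_amplitude": max(pitches) - min(pitches),
        "yaw_amplitude": max(yaws) - min(yaws),
    }

def predict_engagement(features, pitch_thresh=15.0, yaw_thresh=20.0):
    """Hand-coded stand-in for a learned decision tree: larger head-movement
    and rotation amplitudes map to higher engagement, reflecting the
    moderate positive correlation reported in the paper."""
    if features["pitch_amplitude"] > pitch_thresh:
        return "high"
    if features["yaw_amplitude"] > yaw_thresh:
        return "medium"
    return "low"
```

In practice the paper learns the tree from data rather than hand-coding thresholds; a library such as scikit-learn's `DecisionTreeClassifier` would fill that role given labeled engagement examples.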