Taming Mona Lisa: Communicating gaze faithfully in 2D and 3D facial projections
ACM Transactions on Interactive Intelligent Systems (TiiS)
Reading gaze direction is important in human-robot interaction, as it supports, among other things, joint attention and non-linguistic interaction. While most previous work focuses on enabling the robot to read gaze direction, little is known about how well the human partner in a human-robot interaction can read gaze direction from a robot. The purpose of this paper is twofold: (1) to introduce a new technology for implementing a robotic face using retro-projected animated faces, and (2) to test how well this technology supports gaze reading by humans. We briefly describe the robot design and discuss the parameters that influence the ability to read gaze direction. We then present an experiment assessing users' ability to read gaze direction for a selection of robotic face designs, using an actual human face as a baseline. The results indicate that it is hard to match human-human interaction performance: when the robot face is implemented as a hemisphere, performance is worst, whereas robot faces with a human-like physiognomy and, perhaps surprisingly, video projected on a flat screen perform equally well, suggesting that these are good candidates for implementing joint attention in HRI.