Low-latency combined eye and head tracking system for teleoperating a robotic head in real-time
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
A novel paradigm for the evaluation of human-robot interaction is proposed, with special focus on the importance of natural eye and head movements in nonverbal human-machine communication scenarios. We present an experimental platform that enables Wizard-of-Oz experiments in which a human experimenter (wizard) teleoperates a robotic head and eyes with their own head and eyes. Because the robot is animated by the nonverbal behavior of the human experimenter, the whole range of human eye movements can be presented without first implementing a complete gaze-behavior model. The experimenter watches and reacts to a video stream of the participant, who interacts directly with the robot. We present results on the technical aspects of the experimental platform that enable real-time, humanlike interaction. In particular, we evaluate the tracking of oculomotor dynamics, its replication in a robotic active vision system, and the teleoperation delays involved. This setup will help to answer the question of which aspects of human gaze and head-movement behavior must be implemented to achieve humanness in robotic active vision systems.
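The abstract states that teleoperation delays are evaluated. A minimal sketch of how such an end-to-end delay could be measured, by timestamping each gaze sample at capture and again when the corresponding robot command is issued, is shown below. All function names and the latency pipeline here are illustrative assumptions; the abstract does not describe the platform's actual APIs.

```python
import time
from collections import deque

def capture_gaze_sample():
    """Hypothetical stand-in for the eye tracker:
    returns (capture_timestamp, gaze_angle_deg)."""
    return time.monotonic(), 0.0  # placeholder gaze angle

def send_robot_command(angle_deg):
    """Hypothetical stand-in for the robotic eye actuator;
    assumed to accept a target angle and return immediately."""
    pass

# Collect software-side latency samples over a short measurement window.
latencies = deque(maxlen=1000)
for _ in range(100):
    t_capture, angle = capture_gaze_sample()
    send_robot_command(angle)  # map gaze angle to an eye-motor target
    latencies.append(time.monotonic() - t_capture)

mean_latency_ms = 1000 * sum(latencies) / len(latencies)
print(f"mean teleoperation delay: {mean_latency_ms:.3f} ms")
```

In a real setup the actuator's mechanical settling time would dominate, so the hardware-side delay would additionally have to be measured externally (e.g. with a high-speed camera observing both the wizard's eye and the robot's eye).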