TRICLOPS: a tool for studying active vision
International Journal of Computer Vision - Special issue on active vision II
Real-Time Stereo Tracking for Head Pose and Gaze Estimation
FG '00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition 2000
An Algorithm for Real-Time Stereo Vision Implementation of Head Pose and Gaze Direction Measurement
FG '00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition 2000
Gaze Tracking for Multimodal Human-Computer Interaction
ICASSP '97 Proceedings of the 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '97) - Volume 4
FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition
Interactive skills using active gaze tracking
Proceedings of the 5th international conference on Multimodal interfaces
The Agile Stereo Pair for active vision
Machine Vision and Applications
A Multi-functional Entertaining and Educational Robot
Journal of Intelligent and Robotic Systems
Using liquid lenses to extend the operating range of a remote gaze tracking system
SMC'09 Proceedings of the 2009 IEEE international conference on Systems, Man and Cybernetics
A study of a retro-projected robotic face and its effectiveness for gaze reading by humans
Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction
Teleoperation through eye gaze (TeleGaze): a multimodal approach
ROBIO'09 Proceedings of the 2009 international conference on Robotics and biomimetics
Proceedings of the 2012 ACM Conference on Ubiquitous Computing
In our effort to make human-robot interfaces more user-friendly, we built an active gaze tracking system that measures a person's gaze direction in real time. Gaze usually indicates which object in the surroundings a person is interested in, so it can serve as a medium for human-robot interaction, for example instructing a robot arm to pick up the object a user is looking at. In this paper, we describe how we developed and integrated algorithms for zoom camera calibration, low-level control of the active head, and face and gaze tracking to build an active gaze tracking system.
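The abstract names a pipeline of stages (zoom camera calibration, active-head control, face tracking, gaze estimation) and a target application (picking the object a user looks at). The sketch below illustrates how such stages might be wired together; every class, method, and parameter here is an illustrative assumption, not the authors' actual implementation.

```python
# Hypothetical sketch of an active gaze tracking pipeline: track the face,
# estimate gaze direction, steer the active head to keep the face centered,
# and map gaze onto a known object for a robot-arm picking task.
# All names and numbers are assumptions made for illustration.

from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class GazeEstimate:
    yaw_deg: float    # horizontal gaze angle
    pitch_deg: float  # vertical gaze angle


class ActiveGazeTracker:
    """Toy stand-in for the stages named in the abstract."""

    def __init__(self, focal_length_mm: float = 25.0):
        # Zoom camera calibration would fit intrinsics per zoom setting;
        # here a single focal length is stored as a placeholder.
        self.focal_length_mm = focal_length_mm

    def track_face(self, frame: dict) -> Tuple[float, float]:
        # A real system detects and tracks the face in the image;
        # in this sketch the "frame" already carries the face center (px).
        return frame["face_center"]

    def estimate_gaze(self, frame: dict) -> GazeEstimate:
        # Head pose plus eye features would yield gaze direction;
        # the frame carries it directly here.
        yaw, pitch = frame["gaze"]
        return GazeEstimate(yaw, pitch)

    def head_command(self, face_center: Tuple[float, float],
                     image_size: Tuple[int, int] = (640, 480)
                     ) -> Tuple[float, float]:
        # Low-level active-head control: pan/tilt proportional to the
        # offset of the face from the image center (arbitrary gain).
        cx, cy = image_size[0] / 2, image_size[1] / 2
        k = 0.05
        return (k * (face_center[0] - cx), k * (face_center[1] - cy))


def pick_target(gaze: GazeEstimate,
                objects: Dict[str, Tuple[float, float]]) -> str:
    # Choose the known object whose direction is closest to the gaze,
    # as in the robot-arm picking scenario mentioned in the abstract.
    def dist(d: Tuple[float, float]) -> float:
        return abs(d[0] - gaze.yaw_deg) + abs(d[1] - gaze.pitch_deg)
    return min(objects, key=lambda name: dist(objects[name]))


if __name__ == "__main__":
    tracker = ActiveGazeTracker()
    frame = {"face_center": (400.0, 200.0), "gaze": (12.0, -3.0)}
    pan, tilt = tracker.head_command(tracker.track_face(frame))
    target = pick_target(tracker.estimate_gaze(frame),
                         {"cup": (10.0, -5.0), "book": (-30.0, 0.0)})
    print(round(pan, 2), round(tilt, 2), target)  # → 4.0 -2.0 cup
```

The proportional pan/tilt law is the simplest possible servo loop; the paper's actual low-level head control and calibration procedures are more involved.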