A real-time gaze-tracking system has been developed that estimates the user's eye gaze and computes the window of focused view on a computer monitor. The system is based on an artificial neural network and can be trained and customized for an individual user. Unlike existing systems that rely on skin-color features and/or head-mounted equipment, it requires only a simple, non-intrusive camera mounted on the monitor. The gaze point is estimated to within 1 in. on a 19-in. monitor using a CCD camera with 640 × 480 resolution. System performance is independent of the user's forward-backward and up-down head movements. The implementation of the gaze-tracking system and the factors affecting its performance are discussed and analyzed in detail, along with the features and implementation methods that make the system run in real time.
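The abstract does not specify the network architecture or calibration procedure, but the core idea — learning a per-user mapping from camera-derived eye features to screen coordinates — can be illustrated with a minimal sketch. Everything below is an assumption for illustration: a synthetic feature extractor stands in for the real image processing, the calibration grid, hidden-layer size, and training schedule are invented, and the 15-in. usable screen width is an estimate for a 19-in. 4:3 monitor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the camera's feature extraction: maps a true
# normalized gaze point in [0,1]^2 to a mildly nonlinear feature vector
# (a real system would extract pupil/eye-corner features from the image).
def fake_features(screen_xy):
    x, y = screen_xy[:, 0], screen_xy[:, 1]
    return np.stack([x + 0.05 * y**2, y - 0.05 * x**2], axis=1)

# Calibration: the user fixates a 5x5 grid of known screen points.
grid = np.array([[i / 4.0, j / 4.0] for i in range(5) for j in range(5)])
X, Y = fake_features(grid), grid

# One-hidden-layer network mapping features -> gaze point (sizes assumed).
H = 16
W1 = rng.normal(0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 2)); b2 = np.zeros(2)

lr = 0.1
for step in range(5000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # predicted screen coordinates
    err = pred - Y
    loss = (err ** 2).mean()
    # Manual backpropagation of the squared-error loss.
    dpred = 2.0 * err / len(X)
    dW2 = h.T @ dpred; db2 = dpred.sum(0)
    dh = (dpred @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g

# Held-out fixation: convert the prediction error to inches, assuming
# roughly 15 in. of usable horizontal width on a 19-in. 4:3 monitor.
test_pt = np.array([[0.3, 0.7]])
pred_pt = np.tanh(fake_features(test_pt) @ W1 + b1) @ W2 + b2
err_in = float(np.abs(pred_pt - test_pt).max() * 15.0)
```

This is only a schematic of per-user calibration: collect fixations at known targets, fit the network, then map live features to screen positions. A real implementation would also need the image-processing front end and the real-time engineering the paper discusses.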