Evaluation of eye gaze interaction
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Non-Intrusive Gaze Tracking Using Artificial Neural Networks
Low-cost gaze interaction: ready to deliver the promises
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Adaptive eye-gaze-guided interfaces: design & performance evaluation
CHI '11 Extended Abstracts on Human Factors in Computing Systems
Evaluation of a remote webcam-based eye tracker
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Identifying usability issues via algorithmic detection of excessive visual search
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Usability evaluation of eye tracking on an unmodified common tablet
CHI '13 Extended Abstracts on Human Factors in Computing Systems
Hybrid method based on topography for robust detection of iris center and eye corners
ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)
An eye-gaze-guided computer interface could enable computer use by people with severe disabilities, but existing systems cost tens of thousands of dollars or require cumbersome setups. This paper presents a methodology for real-time eye gaze tracking using a standard webcam, without hardware modification or special camera placement. An artificial neural network estimates the location of the user's gaze from an image of the user's eye, mimicking the way humans determine where another person is looking. Accuracy measurements and usability experiments were performed on a laptop computer with a webcam built into the screen. The results show this approach to be promising for the development of usable eye tracking systems based on standard webcams, particularly those built into many laptop computers.
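The appearance-based approach the abstract describes can be sketched as a small feedforward network that maps a flattened grayscale eye patch directly to screen coordinates. The patch size, hidden-layer width, and random (untrained) weights below are illustrative assumptions, not the paper's actual architecture; a real system would train the weights on calibration images.

```python
import math
import random

random.seed(0)

IMG_H, IMG_W = 15, 40   # assumed eye-patch resolution (hypothetical)
HIDDEN = 8              # small hidden layer, chosen for illustration
N_IN = IMG_H * IMG_W

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Randomly initialised weights stand in for a trained network.
W1 = [[random.uniform(-0.1, 0.1) for _ in range(N_IN)] for _ in range(HIDDEN)]
W2 = [[random.uniform(-0.1, 0.1) for _ in range(HIDDEN)] for _ in range(2)]

def estimate_gaze(pixels):
    """Map a flattened grayscale eye image (values in [0, 1]) to
    normalised screen coordinates (x, y), each in (0, 1)."""
    hidden = [sigmoid(sum(w * p for w, p in zip(row, pixels)))
              for row in W1]
    return tuple(sigmoid(sum(w * h for w, h in zip(row, hidden)))
                 for row in W2)

# Example: a synthetic mid-gray eye patch.
patch = [0.5] * N_IN
x, y = estimate_gaze(patch)
```

Because the output layer is sigmoidal, the estimate always falls inside the unit square and can be scaled to pixel coordinates for any screen resolution.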