This paper proposes a prototype system, Gaze-Learning-Access-and-Search-Engine 0.1 (GLASE), that performs image relevance ranking based on gaze data and within-session learning. We developed a search user interface that uses an eye tracker as an input device, together with a relevance re-ranking algorithm based on gaze length. Preliminary experimental results showed that our gaze-driven system reduced task completion time by an average of 13.7% per search session.
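The abstract does not spell out the re-ranking algorithm; a minimal sketch, assuming "gaze length" means the total fixation duration accumulated per image within a session, might look like the following. The function name, the `(image_id, duration_ms)` fixation format, and the example data are all hypothetical, not from the paper.

```python
from collections import defaultdict

def rerank_by_gaze(results, fixations):
    """Re-rank image results by total gaze duration.

    A hypothetical reading of a gaze-length heuristic: images the
    user looked at longer are promoted as more relevant.

    results   -- image ids in the search engine's original order
    fixations -- iterable of (image_id, duration_ms) gaze events
    """
    gaze_time = defaultdict(float)
    for image_id, duration_ms in fixations:
        gaze_time[image_id] += duration_ms
    # Stable sort: items with equal gaze time keep the original
    # engine ranking.
    return sorted(results, key=lambda image_id: -gaze_time[image_id])

# Example: two fixations on "b", one on "c", none on "a".
ranked = rerank_by_gaze(
    ["a", "b", "c"],
    [("b", 400.0), ("c", 150.0), ("b", 250.0)],
)
# ranked == ["b", "c", "a"]
```

Within-session learning could then feed the same accumulated gaze times into a relevance model that updates after each query, though the paper's abstract gives no detail on that step.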