Query formulation and efficient navigation through data to reach relevant results are undoubtedly major challenges for image and video retrieval. Good-quality queries are typically not available, so the search process must rely on relevance feedback from the user, which makes search iterative. Giving explicit relevance feedback is laborious, not always easy, and may even be impossible in ubiquitous computing scenarios. A central question is therefore: can scarce explicit feedback be replaced or complemented with implicit feedback inferred from sensors not specifically designed for the task? In this paper, we present preliminary results on inferring the relevance of images from implicit feedback about users' attention, measured with an eye-tracking device. We show that, at least in reasonably controlled setups, fairly simple features and classifiers can already detect relevance from eye movements alone, without any explicit feedback.
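The abstract does not name the specific gaze features or classifier used. As a purely illustrative sketch of the general idea, the following assumes hypothetical per-image features (fixation count, total dwell time, mean fixation duration) computed from synthetic fixation data, and a plain logistic-regression classifier; none of these choices are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def gaze_features(fix_durations_ms):
    """Three simple per-image gaze features (hypothetical choices):
    fixation count, total dwell time, mean fixation duration."""
    d = np.asarray(fix_durations_ms, dtype=float)
    if d.size == 0:
        return np.zeros(3)
    return np.array([d.size, d.sum(), d.mean()])

def make_trial(relevant):
    # Synthetic stand-in for eye-tracker output: relevant images
    # tend to attract more and longer fixations.
    n = rng.poisson(6 if relevant else 3) + 1
    mean_dur = 280.0 if relevant else 200.0
    return rng.normal(mean_dur, 50.0, size=n)

# Toy dataset of 200 viewed images, half relevant.
labels = np.array([1] * 100 + [0] * 100)
X = np.stack([gaze_features(make_trial(y)) for y in labels])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardise features

# Logistic regression trained by batch gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - labels)) / len(labels)
    b -= 0.5 * (p - labels).mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = (pred == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On this synthetic data the classifier separates the two classes well above chance, which mirrors the paper's point that even simple features and classifiers can recover relevance from gaze alone; real eye-tracking data would of course be noisier.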