Implicit feedback for inferring user preference: a bibliography
ACM SIGIR Forum
Eye-tracking analysis of user behavior in WWW search
Proceedings of the 27th annual international ACM SIGIR conference on Research and development in information retrieval
Attention-based information retrieval using eye tracker data
Proceedings of the 3rd international conference on Knowledge capture
Using visual attention to extract regions of interest in the context of image retrieval
Proceedings of the 44th annual Southeast regional conference
Eye tracking -- A new interface for visual exploration
BT Technology Journal
Attention-based information retrieval
Proceedings of the 30th annual international ACM SIGIR conference on Research and development in information retrieval
Perceptual image retrieval using eye movements
International Journal of Computer Mathematics - Computer Vision and Pattern Recognition
A user-oriented webpage ranking algorithm based on user attention time
Proceedings of the 23rd national conference on Artificial intelligence - Volume 2
An eye-tracking-based approach to facilitate interactive video search
Proceedings of the 1st ACM International Conference on Multimedia Retrieval
Probabilistic proactive timeline browser
Proceedings of the 21st international conference on Artificial neural networks - Volume Part II
A survey of semantic multimedia retrieval systems
Proceedings of the 13th WSEAS international conference on Mathematical and computational methods in science and engineering
Content based recommender system by using eye gaze data
Proceedings of the Symposium on Eye Tracking Research and Applications
In this paper we propose an implicit relevance feedback method that aims to improve the performance of existing Content Based Image Retrieval (CBIR) systems by re-ranking the retrieved images according to users' eye gaze data. This represents a new mechanism for implicit relevance feedback: the sources usually taken into account for image retrieval are based on the user's natural behavior in his/her environment, estimated by analyzing mouse and keyboard interactions. In detail, after images have been retrieved by querying a CBIR system with a keyword, our system computes the most salient regions of the retrieved images (those the users look at with the greatest interest) by gathering data from an unobtrusive eye tracker, such as the Tobii T60. Based on the color and texture features of these relevant regions, our system re-ranks the images initially retrieved by the CBIR system. A performance evaluation, carried out with a set of 30 users using Google Images and the keyword "pyramid", shows that about 87% of the users are more satisfied with the output images when the re-ranking is applied.
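The abstract's pipeline (weight image regions by fixation time, build a feature profile of the salient regions, then re-rank retrieved images by similarity to that profile) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the dictionary-based data structures, the fixation-time weighting, and the cosine-similarity ranking are all assumptions introduced here for clarity.

```python
# Hedged sketch of gaze-based re-ranking. All names and structures below
# are illustrative assumptions: fixations maps region id -> total fixation
# time, region_features and image feature vectors stand in for color/texture
# descriptors, and cosine similarity is an assumed ranking criterion.

def salient_profile(fixations, region_features):
    """Fixation-time-weighted average of the features of gazed regions."""
    total = sum(fixations.values())
    dim = len(next(iter(region_features.values())))
    profile = [0.0] * dim
    for region_id, t in fixations.items():
        w = t / total  # weight a region by its share of fixation time
        for i, v in enumerate(region_features[region_id]):
            profile[i] += w * v
    return profile

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def rerank(images, profile):
    """Sort (image_id, feature_vector) pairs, most similar to the gaze profile first."""
    return sorted(images, key=lambda item: cosine(item[1], profile), reverse=True)

# Toy usage: the user fixated region r1 twice as long as r2, so images
# resembling r1's features should be promoted.
profile = salient_profile({"r1": 2.0, "r2": 1.0},
                          {"r1": [1.0, 0.0], "r2": [0.0, 1.0]})
ranking = rerank([("img_b", [0.0, 1.0]), ("img_a", [1.0, 0.0])], profile)
```

In this toy run `ranking` places `img_a` first, since its features match the region the user fixated longest.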