We introduce GaZIR, a gaze-based interface for browsing and searching for images. The system computes online predictions of image relevance from implicit feedback, and when the user zooms in, the images predicted to be most relevant are brought forward. The key novelty is that the relevance feedback is inferred from implicit cues obtained in real time from the gaze pattern, using an estimator learned during a separate training phase. The natural zooming interface can be connected to any content-based information retrieval engine that operates on user feedback. Experiments with one such engine show that the gaze patterns carry a sufficient amount of information to make the estimated relevance feedback a viable complement to, or even replacement for, explicit point-and-click feedback.
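The pipeline the abstract describes — learn a relevance estimator from gaze features offline, then score viewed images online and surface the top-ranked ones — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the feature names (fixation duration, fixation count), the logistic-regression estimator, and all data are assumptions chosen for clarity.

```python
import math

def sigmoid(z):
    """Logistic function, clamped to avoid overflow for large |z|."""
    if z > 60:
        return 1.0
    if z < -60:
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def train_relevance_estimator(features, labels, lr=0.5, epochs=200):
    """Fit logistic-regression weights by stochastic gradient descent.

    This stands in for the 'estimator learned during a separate
    training phase' mentioned in the abstract.
    """
    w = [0.0] * (len(features[0]) + 1)  # last entry is the bias term
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])
            g = p - y  # gradient of the log-loss w.r.t. the logit
            for i, xi in enumerate(x):
                w[i] -= lr * g * xi
            w[-1] -= lr * g
    return w

def predict_relevance(w, x):
    """Online relevance score in [0, 1] for one image's gaze features."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])

# Toy training data (hypothetical): each row is
# [total fixation time in seconds, number of fixations] on an image,
# labeled 1 if the user later marked the image relevant.
train_x = [[1.8, 5], [2.1, 6], [0.3, 1], [0.4, 2], [1.5, 4], [0.2, 1]]
train_y = [1, 1, 0, 0, 1, 0]
w = train_relevance_estimator(train_x, train_y)

# Online phase: score unseen images; a zooming interface would bring
# the highest-scoring ones forward.
scores = {img: predict_relevance(w, x)
          for img, x in {"img_a": [1.9, 5], "img_b": [0.25, 1]}.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
```

In this sketch, `ranked[0]` is the long-dwell image `img_a`, reflecting the assumption that longer and more frequent fixations signal relevance; a real system would use a richer gaze-feature set and could swap in any probabilistic classifier for the estimator.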