Web search behavior studies, including eye-tracking studies of search result examination, have yielded numerous insights for improving search result quality and presentation. Yet these studies have been severely limited in scale by the expense and effort they require. We propose a novel methodology for crowdsourcing web search behavior studies, focusing specifically on large-scale studies of result examination behavior. We present a viewport-based examination interface (ViewSer) that enables remote tracking of searcher examination behavior without requiring eye-tracking equipment. In a study with over 100 remote participants, we show that ViewSer induces viewing and clickthrough behavior similar to that of in-lab users monitored with eye tracking. ViewSer is a first step toward large-scale behavioral evaluation of web search, which would help improve web search result presentation and ranking, and ultimately the overall web search experience.
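To make the viewport-based idea concrete, the following is a minimal sketch of how per-result examination time might be estimated from scroll events alone, in the spirit of the approach described above. The event format, the half-visible threshold, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
def visible_results(scroll_y, viewport_h, results):
    """Return indices of results at least half inside the viewport.

    `results` is a list of (top, height) pixel positions on the page;
    a result counts as "examined" when half of it is visible (assumption).
    """
    top, bottom = scroll_y, scroll_y + viewport_h
    vis = []
    for i, (r_top, r_h) in enumerate(results):
        overlap = min(bottom, r_top + r_h) - max(top, r_top)
        if overlap >= r_h / 2:
            vis.append(i)
    return vis

def dwell_times(scroll_events, viewport_h, results):
    """Accumulate seconds each result spent visible between scroll events.

    `scroll_events` is a time-ordered list of (timestamp, scroll_y) pairs;
    the viewport is assumed static between consecutive events.
    """
    dwell = {i: 0.0 for i in range(len(results))}
    for (t0, y0), (t1, _) in zip(scroll_events, scroll_events[1:]):
        for i in visible_results(y0, viewport_h, results):
            dwell[i] += t1 - t0
    return dwell
```

For example, with three 100-pixel results, a 200-pixel viewport, and events `[(0, 0), (5, 120), (8, 120)]`, the first result accumulates 5 seconds, the second 8, and the third 3. Aggregating such dwell estimates over many remote participants is what allows viewport data to stand in for eye-tracking fixations at scale.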