In this paper, we describe one of the largest multi-site interactive video retrieval experiments conducted in a laboratory setting. Interactive video retrieval performance is difficult to compare across studies because of confounding variables: the users, the interface, and the underlying retrieval engine. Within the framework of TRECVID 2008, we conducted a multi-site, multi-interface experiment involving 36 users from three institutes: 12 each from Dublin City University (DCU, Ireland), the University of Glasgow (GU, Scotland) and Centrum Wiskunde & Informatica (CWI, the Netherlands). Three user interfaces were developed, all backed by the same search service. Using a Latin-square arrangement, each user completed 12 topics, yielding 6 TRECVID runs per site and 18 in total. This design allowed us to isolate the effects of users and interfaces on retrieval performance. We present an analysis of both the quantitative and qualitative data generated by this experiment, demonstrating that for interactive video retrieval with "novice" users, performance can vary by up to 300% for the same system across different sets of users, whereas the differences in performance between interface variants were not statistically significant. Our results have implications for how interactive video retrieval experiments with non-expert users are evaluated. The primary contribution of this paper is to highlight that non-expert users generate very large performance fluctuations, which may either mask or mimic system variability. An explanation of why this occurs is beyond the scope of this paper.
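The Latin-square arrangement mentioned above can be sketched in code. The following is a minimal illustration, not the authors' actual assignment procedure: it builds an n x n Latin square by cyclic rotation, so that every condition (here, a hypothetical set of interface variants) appears exactly once per row and per column, balancing order effects across user groups.

```python
def latin_square(n):
    """Return an n x n Latin square built by cyclic rotation:
    row i is [i, i+1, ..., i+n-1] mod n, so every symbol occurs
    exactly once in each row and each column."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

# Hypothetical illustration: rotate 3 interface variants across
# 3 user groups so each group uses every interface equally often.
interfaces = ["Interface-1", "Interface-2", "Interface-3"]
for group, row in enumerate(latin_square(3)):
    print(f"group {group}: {[interfaces[k] for k in row]}")
```

In a full counterbalanced design, the same rotation would also be applied to topic order, so that neither interface order nor topic order systematically favours any condition.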