Comparing fact finding tasks and user survey for evaluating a video browsing tool

  • Authors:
  • Werner Bailer; Herwig Rehatschek

  • Affiliations:
  • JOANNEUM RESEARCH, Graz, Austria (both authors)

  • Venue:
  • MM '09: Proceedings of the 17th ACM International Conference on Multimedia
  • Year:
  • 2009

Abstract

There are still no established methods for evaluating browsing and exploratory search tools. In the (multimedia) information retrieval community, evaluations following the Cranfield paradigm (as used, e.g., in TRECVID) have been widely adopted. We have applied two TRECVID-style fact-finding approaches (retrieval and question answering tasks) and a user survey to the evaluation of a video browsing tool. We analyze the correlation between the results of the different methods, whether different aspects can be evaluated independently with the survey, and whether a learning effect can be measured with the different methods. The results show that the retrieval task correlates better with the user experience reported in the survey than the question answering tasks do. It turns out that the survey measures general user experience rather than individual aspects of usability, which cannot be analyzed independently.
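
The abstract reports a correlation analysis between the fact-finding results and the survey, but does not state which correlation measure was used or how the data were organized. The following Python sketch is therefore only an illustration under assumptions: it uses Spearman rank correlation on hypothetical per-participant scores (none of the numbers come from the paper) to show how such a comparison between task performance and survey ratings could be computed.

    # Illustrative sketch only: Spearman rank correlation and all data below
    # are assumptions for demonstration, not taken from the paper.
    from scipy.stats import spearmanr

    # Hypothetical per-participant scores:
    # fact-finding task results and overall survey ratings.
    retrieval_scores = [0.62, 0.48, 0.71, 0.55, 0.80, 0.43, 0.67, 0.59]
    qa_scores        = [0.50, 0.44, 0.38, 0.61, 0.52, 0.47, 0.40, 0.58]
    survey_ratings   = [4.1,  3.5,  4.4,  3.8,  4.7,  3.2,  4.0,  3.9]

    # Correlate each fact-finding measure with the survey ratings.
    rho_retrieval, p_retrieval = spearmanr(retrieval_scores, survey_ratings)
    rho_qa, p_qa = spearmanr(qa_scores, survey_ratings)

    print(f"retrieval vs. survey: rho={rho_retrieval:.2f}, p={p_retrieval:.3f}")
    print(f"QA vs. survey:        rho={rho_qa:.2f}, p={p_qa:.3f}")

A higher correlation for the retrieval scores than for the question answering scores would mirror the paper's finding that the retrieval task aligns more closely with the surveyed user experience.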