Crowdsourcing rock n' roll multimedia retrieval
Proceedings of the international conference on Multimedia
In this paper, we study social tagging at the video fragment level using a combination of automated content understanding and the wisdom of the crowd. We are interested in whether crowdsourcing can benefit a video search engine that automatically recognizes video fragments at a semantic level. To answer this question, we perform a three-month online field study with a concert video search engine targeted at a dedicated community of pop concert enthusiasts. We harvest the feedback of more than 500 active users and perform two experiments. In experiment 1 we measure the users' incentive to provide feedback; in experiment 2 we determine the trade-off between feedback quality and quantity when feedback is aggregated over multiple users. Results show that users provide sufficient feedback, which becomes highly reliable when a crowd agreement of 67% is enforced.
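The 67% crowd-agreement rule lends itself to a short illustration. The Python sketch below shows one way such a per-fragment agreement threshold could be applied to collected user feedback; the function, the data layout, and the reading of 67% as a two-thirds majority are assumptions for illustration, not the study's actual implementation.

```python
# Hypothetical sketch: accept a fragment-level judgment only when enough
# users agree. The 67% figure from the abstract is interpreted here as a
# two-thirds majority; names and data layout are illustrative only.
from collections import Counter

def aggregate_feedback(votes, threshold=2 / 3):
    """Aggregate crowd feedback per video fragment.

    votes: mapping of fragment_id -> list of user judgments
           (e.g. True/False votes on an automatically detected concept).
    Returns: fragment_id -> accepted judgment, or None when agreement
             stays below the threshold.
    """
    accepted = {}
    for fragment_id, judgments in votes.items():
        label, count = Counter(judgments).most_common(1)[0]
        agreement = count / len(judgments)
        accepted[fragment_id] = label if agreement >= threshold else None
    return accepted

# Example: three fragments with user votes on a detected concept.
votes = {
    "fragment_001": [True, True, True],    # 100% agreement -> accepted
    "fragment_002": [True, True, False],   # 2/3 agreement  -> accepted
    "fragment_003": [True, False],         # 50% agreement  -> rejected
}
print(aggregate_feedback(votes))
```

Under this reading, a judgment is only kept once the majority vote reaches the two-thirds mark, which mirrors the trade-off the abstract describes between feedback quantity and reliability.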