The Cranfield tests on index language devices. In Readings in Information Retrieval.
Labeling images with a computer game. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Minimal test collections for retrieval evaluation. In Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '06).
A statistical method for system evaluation using incomplete judgments. In Proceedings of the 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '06).
A simple and efficient sampling method for estimating AP and NDCG. In Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
The Unreasonable Effectiveness of Data. IEEE Intelligent Systems.
Learning to Rank for Information Retrieval. Foundations and Trends in Information Retrieval.
CrowdSearch: exploiting crowds for accurate real-time image search on mobile phones. In Proceedings of the 8th International Conference on Mobile Systems, Applications, and Services.
Crowdsourcing for search evaluation. ACM SIGIR Forum.
Crowdsourcing 101: putting the WSDM of crowds to work for you. In Proceedings of the Fourth ACM International Conference on Web Search and Data Mining.
Crowdsourcing for search and data mining. ACM SIGIR Forum.
Crowdsourcing for information retrieval: principles, methods, and applications. In Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval.
Crowdsourcing for information retrieval. ACM SIGIR Forum.
Quality through flow and immersion: gamifying crowdsourced relevance assessments. In Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '12).
Proceedings of the 35th European Conference on Advances in Information Retrieval (ECIR '13).
Crowdsourcing for information retrieval: introduction to the special issue. Information Retrieval.
Crowdsourcing-assisted query structure interpretation. In Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI '13).
The 2nd SIGIR Workshop on Crowdsourcing for Information Retrieval (CIR 2011) was held on July 28, 2011 in Beijing, China, in conjunction with the 34th Annual ACM SIGIR Conference. The workshop brought together researchers and practitioners to disseminate recent advances in theory, empirical methods, and novel applications of crowdsourcing for information retrieval (IR). The workshop program included three invited talks, a panel discussion entitled "Beyond the Lab: State-of-the-Art and Open Challenges in Practical Crowdsourcing," and presentations of nine refereed research papers and one demonstration paper. A Best Paper Award, sponsored by Microsoft Bing, was presented to Jun Wang and Bei Yu for their paper "Labeling Images with Queries: A Recall-based Image Retrieval Game Approach." A Crowdsourcing Challenge contest, sponsored by CrowdFlower, was also announced prior to the workshop; it offered both seed funding and advanced technical support for the winner to use CrowdFlower's services for innovative work. The workshop organizers selected Mark Smucker as the winner based on his proposal "The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior." Proceedings of the workshop are available online [15].