We discuss the concept of relevance criteria in the context of e-Commerce search. A vast body of research literature describes the beyond-topical criteria that users apply when judging the relevance of a document to an information need. We argue that the e-Commerce scenario differs in some respects, and that novel criteria come into play when determining relevance. We validate this hypothesis experimentally through a crowdsourcing study run on Amazon Mechanical Turk.
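A crowdsourcing study of this kind typically collects several relevance labels per item from different workers and aggregates them into a single judgment. A minimal sketch of majority-vote aggregation is shown below; the function name, item identifiers, and label vocabulary are illustrative assumptions, not taken from the paper:

```python
from collections import Counter

def aggregate_judgments(judgments):
    """Majority-vote aggregation of crowd relevance labels.

    judgments: dict mapping an item id to the list of labels
    collected from workers (e.g. "relevant" / "not relevant").
    Returns one aggregated label per item; among tied labels,
    Counter.most_common picks one arbitrarily.
    """
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in judgments.items()}

# Hypothetical labels for two e-Commerce search results:
labels = {
    "product-1": ["relevant", "relevant", "not relevant"],
    "product-2": ["not relevant", "not relevant", "relevant"],
}
print(aggregate_judgments(labels))
# → {'product-1': 'relevant', 'product-2': 'not relevant'}
```

In practice, platforms such as Mechanical Turk also require quality controls (e.g. qualification tests or gold-standard items) before such a simple vote can be trusted.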