To understand how a labor market for human computation functions, it is important to know how workers search for tasks. This paper uses two complementary methods to gain insight into how workers search for tasks on Mechanical Turk. First, we perform a high-frequency scrape of 36 pages of search results and analyze the rate at which tasks disappear under the key sort orders Mechanical Turk offers workers. Second, we present the results of a survey in which we paid workers for self-reported information about how they search for tasks. Our main findings are that, in aggregate, workers sort by which tasks were most recently posted and which have the largest number of instances available. Workers look mostly at the first page of the most recently posted tasks and the first two pages of the tasks with the most available instances, but in both categories a task's position within a results page is unimportant to workers. We observe that at least some employers try to manipulate the position of their tasks in the search results to exploit workers' tendency to search for recently posted tasks. At the individual level, we observed workers searching by almost all of the available sort categories and looking more than 10 pages deep. For a task we posted to Mechanical Turk, we confirmed that a favorable position in the search results does matter: our task was completed 30 times faster, and for less money, when its position was favorable than when it was unfavorable.
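The disappearance-rate analysis described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: it assumes each scrape yields an ordered list of task-group identifiers for one sort order, and the function names and sample IDs are hypothetical.

```python
from typing import List

def disappearance_rate(prev: List[str], curr: List[str]) -> float:
    """Fraction of task groups seen in the previous scrape that no
    longer appear in the current one (a proxy for task uptake)."""
    if not prev:
        return 0.0
    current_ids = set(curr)
    gone = sum(1 for hit_id in prev if hit_id not in current_ids)
    return gone / len(prev)

# Hypothetical consecutive scrapes of the first results page
# under a single sort order (e.g. "most recently posted").
prev_scrape = ["hit_a", "hit_b", "hit_c", "hit_d"]
curr_scrape = ["hit_b", "hit_d", "hit_e", "hit_f"]

print(disappearance_rate(prev_scrape, curr_scrape))  # 0.5
```

Comparing this rate across sort orders and result pages is what lets one infer where workers are actually looking: pages where tasks vanish faster are pages workers visit more.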