Modeling of crowdsourcing platforms and granularity of work organization in future internet
Proceedings of the 23rd International Teletraffic Congress
Since Jeff Howe first introduced the term "crowdsourcing" in 2006, crowdsourcing has become a growing market on the current Internet. Thousands of workers categorize images, write articles, or perform other small tasks on platforms like Amazon Mechanical Turk (MTurk), Microworkers, or ShortTask. In this work, we give an inside view of usage data from Microworkers and show that it differs significantly from the well-studied MTurk. Further, we look at Microworkers from the perspectives of a worker, an employer, and the platform owner, in order to answer their most important questions: Which jobs pay best? How do I get my work done most quickly? When are the users of my platform active?