We are becoming increasingly aware that the effectiveness of mobile crowdsourcing systems depends critically on the whims of their human participants, affecting everything from engagement levels to compliance with the crowdsourced tasks. In response, a number of such systems have begun to incorporate incentive features aimed at goals ranging from improving participation levels to extending system coverage and enhancing the quality of the collected data. Despite these many efforts, the inclusion of incentives in crowdsourced systems has so far been mostly ad hoc, treating incentives as a wild card fit for any occasion and goal. Using data from a large, two-day experiment with 96 participants at a corporate conference, we analyze the impact of two incentive structures on the recruitment, compliance, and user effort of a basic mobile crowdsourced service. We build on these preliminary results to argue for a principled approach to selecting incentives and incentive structures that matches the varied requirements of mobile crowdsourcing applications, and we discuss key issues in working toward that goal.