TaskRec: probabilistic matrix factorization in task recommendation in crowdsourcing systems
ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part II
In crowdsourcing systems, tasks are distributed to networked workers for completion, which can greatly reduce a company's production cost. It is clearly inefficient when the time a worker spends selecting a task is comparable to the time spent working on it, yet the monetary reward per task is small. Available worker histories make it possible to mine workers' task preferences and provide personalized recommendations. Our exploratory study of survey results collected on Amazon Mechanical Turk (MTurk) shows that worker histories can reflect workers' task preferences in crowdsourcing systems. Task recommendation can help workers find suitable tasks faster, and help requesters receive good-quality output sooner. However, the previously proposed classification-based task recommendation approach considers only worker performance history and does not exploit worker task-searching history. In this paper, we propose a task recommendation framework for task preference modeling and preference-based task recommendation, aiming to recommend tasks that workers are likely to prefer to work on and to complete with output accepted by requesters. We consider both worker performance history and worker task-searching history to reflect workers' task preferences more accurately. To the best of our knowledge, we are the first to use matrix factorization for task recommendation in crowdsourcing systems.
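The abstract describes factorizing a worker-task preference matrix to predict which unseen tasks a worker would prefer. The following is a minimal sketch of that idea using plain SGD matrix factorization, not the paper's exact TaskRec model; the example matrix, hyperparameters, and function name are all illustrative assumptions.

```python
import numpy as np

def factorize(R, mask, k=2, lr=0.02, reg=0.02, epochs=800, seed=0):
    """Illustrative SGD matrix factorization (not the paper's exact model).

    R holds observed worker-task preference scores; mask marks which
    entries are observed. Returns worker and task latent-factor matrices.
    """
    rng = np.random.default_rng(seed)
    n_workers, n_tasks = R.shape
    W = 0.1 * rng.standard_normal((n_workers, k))  # worker latent factors
    T = 0.1 * rng.standard_normal((n_tasks, k))    # task latent factors
    rows, cols = np.nonzero(mask)
    for _ in range(epochs):
        for i, j in zip(rows, cols):
            err = R[i, j] - W[i] @ T[j]            # residual on observed entry
            W[i] += lr * (err * T[j] - reg * W[i])  # regularized SGD step
            T[j] += lr * (err * W[i] - reg * T[j])
    return W, T

# Hypothetical 3-worker x 4-task preference matrix (0 = unobserved).
R = np.array([[5.0, 3.0, 0.0, 1.0],
              [4.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 5.0]])
mask = R > 0
W, T = factorize(R, mask)
pred = W @ T.T  # predicted preferences, including unobserved worker-task pairs
```

Ranking the unobserved entries of `pred` per worker then yields the recommendation list; the paper's contribution is modeling both performance and task-searching history inside such a factorization, which this sketch does not attempt.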