CrowdSeed: query processing on microblogs
Proceedings of the 16th International Conference on Extending Database Technology
Crowd selection is essential to crowdsourcing applications, since choosing workers with the right expertise to carry out crowdsourced tasks is extremely important. The central problem is simple but tricky: given a crowdsourced task, who are the most knowledgeable users to ask? In this demo, we show our framework, which tackles the problem of crowdsourced task assignment on Twitter based on the social activities of its users. Since user profiles on Twitter do not reveal user interests and skills, we transfer knowledge from categorized Yahoo! Answers datasets to learn user expertise, and then select the right crowd for a given task based on that expertise. We study the effectiveness of our system through extensive user evaluation. We further engage attendees in an interactive game, "Whom to Ask on Twitter?", which helps them understand our ideas. Our crowd selection system can be accessed at http://webproject2.cse.ust.hk:8034/tcrowd/.
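The selection step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the system's actual implementation: it assumes a per-category text classifier (standing in for the model transferred from categorized Yahoo! Answers data, here replaced by a trivial keyword matcher) and ranks users by how often their tweets fall into the task's category. The category keywords, user names, and function names are all invented for the example.

```python
from collections import Counter

# Hypothetical keyword sets standing in for a classifier transferred from
# categorized Yahoo! Answers data.
CATEGORY_KEYWORDS = {
    "sports": {"game", "team", "score", "league"},
    "tech": {"code", "software", "phone", "app"},
}

def classify(tweet):
    """Return the category whose keywords overlap the tweet most, or None."""
    words = set(tweet.lower().split())
    best, best_hits = None, 0
    for cat, keywords in CATEGORY_KEYWORDS.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = cat, hits
    return best

def select_crowd(user_tweets, task_category, k=2):
    """Pick the k users whose tweets most often match the task category."""
    scores = Counter()
    for user, tweets in user_tweets.items():
        scores[user] = sum(1 for t in tweets if classify(t) == task_category)
    return [user for user, _ in scores.most_common(k)]
```

A user who tweets mostly about software would thus be ranked first for a "tech" task, even if their profile says nothing about their skills; this is the intuition behind inferring expertise from social activity rather than from self-reported profiles.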