CrowdSeed: query processing on microblogs

  • Authors:
  • Zhou Zhao;Wilfred Ng;Zhijun Zhang

  • Affiliations:
  • The Hong Kong University of Science and Technology, Hong Kong, China (all three authors)

  • Venue:
  • Proceedings of the 16th International Conference on Extending Database Technology
  • Year:
  • 2013


Abstract

Databases often give poor answers to judgmental queries, such as asking for the best among the movies shown in recent months. Processing such queries requires human input to provide missing information and to clarify uncertainty or inconsistency in the queries. Nowadays, it is common for people to seek answers on microblogs by asking questions or sharing them with their friends. This is easily done via smartphones, which diffuse a question to a large number of users through message propagation in microblogs. This important trend is known as CrowdSearch. Because attitudes conflict within a crowd, majority voting is employed as the crowd-wisdom aggregation scheme. In this demo, we study the problem of minimizing the monetary cost of a crowdsourced query, given a specified expected accuracy for the aggregated answer. We present CrowdSeed, a system that automatically integrates human input for processing queries posed on microblogs. We demonstrate the effectiveness and efficiency of our system on real-world data and present interesting results from a game called "Who is in the CrowdSeed?".
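The abstract's cost-accuracy trade-off can be illustrated with a back-of-the-envelope model. Assuming (this is not the paper's actual algorithm, just a common simplification) that each crowd worker answers independently and correctly with probability `p`, and that each answer costs one unit, the expected accuracy of a majority vote grows with crowd size, so the cheapest crowd meeting a target accuracy is the smallest odd size whose majority-vote accuracy reaches that target:

```python
from math import comb

def majority_accuracy(n, p):
    """Probability that a majority of n independent voters, each
    correct with probability p, returns the right answer.
    Assumes odd n so no ties occur."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

def min_voters(p, target):
    """Smallest odd crowd size whose majority vote reaches the
    target accuracy; under a fixed price per answer, this is
    also the minimum monetary cost."""
    n = 1
    while majority_accuracy(n, p) < target:
        n += 2  # keep n odd to avoid ties
    return n
```

For example, with workers of individual accuracy 0.7, a single answer gives 0.7 accuracy, three voters give about 0.78, and nine are needed to exceed 0.9 — which shows why specifying the expected accuracy directly determines the monetary cost of a query.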