Abstract—We present a study of user voting on three websites: IMDb, Amazon, and BookCrossing. We report on an expert evaluation of each website's voting mechanism and a quantitative analysis of users' aggregate voting behavior. Our results suggest that the websites with a higher barrier to voting have a relatively high number of one-off voters, and that they appear to attract mostly experts. We also find that one-off voters tend to vote on popular items, while experts mostly vote on obscure, low-rated items. We conclude with design suggestions to address the "wisdom of the crowd" bias.
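The kind of aggregate analysis the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration (the record schema, user IDs, and item names are invented, not taken from the paper's data): it identifies one-off voters as users with exactly one vote and compares the popularity of the items they rate against the items repeat voters rate.

```python
from collections import Counter

# Hypothetical voting records: (user_id, item_id, rating).
votes = [
    ("u1", "matrix", 9), ("u2", "matrix", 8), ("u3", "matrix", 10),
    ("u1", "obscure", 3), ("u4", "matrix", 7), ("u5", "obscure", 2),
]

votes_per_user = Counter(u for u, _, _ in votes)
votes_per_item = Counter(i for _, i, _ in votes)  # item popularity = vote count

# One-off voters cast exactly one vote; everyone else is a repeat voter.
one_off = {u for u, n in votes_per_user.items() if n == 1}
repeat = set(votes_per_user) - one_off

def mean_item_popularity(group):
    """Average popularity of the items voted on by users in `group`."""
    pops = [votes_per_item[i] for u, i, _ in votes if u in group]
    return sum(pops) / len(pops)

print(mean_item_popularity(one_off))  # 3.5 — one-off voters pick popular items
print(mean_item_popularity(repeat))   # 3.0
```

In this toy sample the one-off voters' items are, on average, more popular than the repeat voters' items, mirroring the pattern the study reports for real data.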