What makes Web sites credible?: a report on a large quantitative study
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Visualization components for persistent conversations
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Persuasive Technology: Using Computers to Change What We Think and Do
Prominence-interpretation theory: explaining how people assess credibility online
CHI '03 Extended Abstracts on Human Factors in Computing Systems
Developing Legible Visualizations for Online Social Spaces
HICSS '02 Proceedings of the 35th Annual Hawaii International Conference on System Sciences (HICSS'02) - Volume 4
Clutter or content?: how on-screen enhancements affect how TV viewers scan and what they learn
Proceedings of the 2006 symposium on Eye tracking research & applications
Information Dashboard Design: The Effective Visual Communication of Data
A content-driven reputation system for the wikipedia
Proceedings of the 16th international conference on World Wide Web
It's All News to Me: The Effect of Instruments on Ratings Provision
HICSS '07 Proceedings of the 40th Annual Hawaii International Conference on System Sciences
Social Information Processing in News Aggregation
IEEE Internet Computing
Crowdsourcing user studies with Mechanical Turk
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Lifting the veil: improving accountability and social transparency in Wikipedia with wikidashboard
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Information quality work organization in wikipedia
Journal of the American Society for Information Science and Technology
Get another label? improving data quality and data mining using multiple, noisy labelers
Proceedings of the 14th ACM SIGKDD international conference on Knowledge discovery and data mining
Honest Signals: How They Shape Our World
Can social annotation support users in evaluating the trustworthiness of video clips?
Proceedings of the 2nd ACM workshop on Information credibility on the web
Can you ever trust a wiki?: impacting perceived trustworthiness in wikipedia
Proceedings of the 2008 ACM conference on Computer supported cooperative work
An annotation model for making sense of information quality in online video
Proceedings of the 3rd International Conference on the Pragmatic Web: Innovating the Interactive Society
Videolyzer: quality analysis of online informational video for bloggers and journalists
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Statement map: assisting information credibility analysis by visualizing arguments
Proceedings of the 3rd workshop on Information credibility on the web
Cheap and fast---but is it good?: evaluating non-expert annotations for natural language tasks
EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
Tweet the debates: understanding community annotation of uncollected sources
WSM '09 Proceedings of the first SIGMM workshop on Social media
Characterizing debate performance via aggregated twitter sentiment
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Who are the crowdworkers?: shifting demographics in mechanical turk
CHI '10 Extended Abstracts on Human Factors in Computing Systems
Assigning trust to Wikipedia content
WikiSym '08 Proceedings of the 4th International Symposium on Wikis
Credibility-inspired ranking for blog post retrieval
Information Retrieval
In this work we develop and evaluate a method for syndicating and visualizing aggregate quality evaluations of informational video, enabling the sharing of knowledge between motivated media watchdogs and a wider population of casual users. We do this with simple visual cues, presented in-line as a video plays, that indicate the activity level and polarity (i.e., positive or negative) of aggregated quality evaluations. In an experiment we show the potential of these visuals to engender constructive changes in the perceived credibility of informational video under some circumstances. We discuss the limitations of, and future work associated with, this approach to video credibility modulation.
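The core aggregation the abstract describes — turning many viewers' positive/negative quality evaluations into per-moment activity and polarity cues — can be sketched roughly as follows. This is an illustrative sketch only: the 10-second segment length, the ±1 polarity encoding, and the function name are assumptions, not details from the paper.

```python
from collections import defaultdict

def aggregate_evaluations(evaluations, segment_length=10):
    """Aggregate timestamped quality evaluations into per-segment visual cues.

    Each evaluation is a (timestamp_seconds, polarity) pair, where polarity
    is +1 (positive) or -1 (negative). For each time segment of the video,
    return an activity level (evaluation count) and a mean polarity in
    [-1, 1], which a player overlay could render in-line as the video plays.
    """
    buckets = defaultdict(list)
    for timestamp, polarity in evaluations:
        buckets[int(timestamp // segment_length)].append(polarity)
    return {
        segment: {"activity": len(ps), "polarity": sum(ps) / len(ps)}
        for segment, ps in buckets.items()
    }

# Three evaluations in the first 10 s (mixed polarity), one in the next.
cues = aggregate_evaluations([(2, 1), (5, -1), (7, -1), (14, 1)])
```

A renderer could then map `activity` to cue size or opacity and `polarity` to color, so that contested moments stand out without requiring viewers to read individual comments.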