In this article, we develop guidelines for evaluating visual analytics environments based on a synthesis of reviews of entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and on a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews, we gained a better understanding of what matters to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the factors they consider important when evaluating visual analysis systems. In addition, we examined guidelines developed by researchers in various domains and synthesized the results of these three efforts into an initial set for use by others in the community. One challenge for future visual analytics systems is to assist in the generation of reports. In our user study, we therefore also worked with analysts to understand the criteria they use to judge the quality of analytic reports; we propose that this knowledge will be useful as researchers build systems that automate parts of report generation. From these efforts, we produced initial guidelines both for evaluating visual analytics environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and limited in scope, as the visual analytics systems we evaluated were used for specific tasks. We offer these guidelines as a starting point for the visual analytics community.