A reflection on seven years of the VAST challenge
Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization
In this paper, we examine the reviews of entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge. By analyzing these reviews, we gained a better understanding of what is important to our reviewers, who include both visualization researchers and professional analysts. This is a bottom-up approach to developing heuristics for the evaluation of visual analytic environments. We present the meta-analysis and its results in this paper.