Empirical evaluation of information visualizations: an introduction
International Journal of Human-Computer Studies - Empirical evaluation of information visualizations
This paper addresses some fundamental issues that need to be considered when reporting an evaluation study in visualization or other areas that develop visual interfaces. Evaluations are now frequently included in publications in these fields, yet no uniform standard for reporting exists. Instead, the structure and content of reports vary widely, and important information is often missing, making it hard to gain insight into, and trust in, the results and conclusions. There is consequently a need for easy-to-access guidance on producing sound, high-quality reports; only through such reporting can authors enable readers to assess a study's validity and replicate it in order to verify it. This paper presents a first effort to introduce guidelines on what constitutes an effective structure, what content to address, and how to address it. It also points out common pitfalls and mistakes in reporting and how to avoid them. The paper can serve as a guide for authors describing their evaluation studies, and it can also be helpful when reviewing publications that present such work, since the same guidelines for content apply.