Toward mixed method evaluations of scientific visualizations and design process as an evaluation tool

  • Authors:
  • Bret Jackson, Dane Coffey, Lauren Thorson, David Schroeder, Arin M. Ellingson, David J. Nuckley, Daniel F. Keefe

  • Affiliations:
  • University of Minnesota (all authors)

  • Venue:
  • Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization
  • Year:
  • 2012


Abstract

In this position paper, we discuss the successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then they may become one of the most effective future strategies for both formative and summative evaluation.