Understanding how scientists use data-display devices for interactive visual computing with geographical models

  • Authors:
  • Alan M. MacEachren; Amy Louise Griffin

  • Year:
  • 2004

Visualization

Abstract

This dissertation reports the results of an empirical study designed to examine how experts use maps in conjunction with other statistical graphics to think about the problem of Hantavirus Pulmonary Syndrome (HPS) risk. Eighteen scientists from one of three disciplines (ecology, epidemiology, and geography) were asked to use a geographical simulation model to explore the case study problem. They were able to interact with the data and the model through three forms of visual display: maps, scatterplots, and time series graphs. To provide a complete picture of users' interaction with the model, data on what participants saw, did, and thought while using the model were collected by video-recording participants as they thought aloud. The transcribed data were coded with categories designed to provide evidence addressing four core research questions: (1) What do participants do with the system? (2) What kinds of information do users attend to in the visual information display? (3) How is the information obtained from the system used? (4) What kinds of hypotheses are generated? Users manipulated maps more often than scatterplots, and scatterplots more often than time series graphs. Users' patterns of visual attention were similar to their patterns of system manipulation. Maps were attended to most commonly when users made comparisons over time, and scatterplots most commonly when users made comparisons over attributes. The characteristics of the hypotheses a user generated were related to the type of visual information display the user attended to. Differences between disciplines in patterns of attention to tools, tool use, and hypothesis characteristics were related to users' strategies for model exploration.
The role of domain-knowledge-based expertise in tool use is an indirect one: rather than directly influencing a person's choice of tools, expertise influences his or her choice of exploration strategy and, thereby, choice of tools. This finding provides a first step toward understanding how users' strategies guide their choice of tools, and how working with these tools in turn shapes how users approach a modeling problem.