Data Sonification for Users with Visual Impairment: A Case Study with Georeferenced Data

  • Authors:
  • Haixia Zhao, Catherine Plaisant, Ben Shneiderman, Jonathan Lazar

  • Affiliations:
  • University of Maryland (Zhao, Plaisant, Shneiderman); Towson University (Lazar)

  • Venue:
  • ACM Transactions on Computer-Human Interaction (TOCHI)
  • Year:
  • 2008


Abstract

We describe the development and evaluation of iSonic, a tool that assists users with visual impairment in exploring georeferenced data through coordinated maps and tables augmented with nontextual sounds and speech output. Our in-depth case studies with 7 blind users, spanning 42 hours of data collection, showed that iSonic enabled them to find facts and discover trends in georeferenced data, even in unfamiliar geographical contexts, without special devices. Our design was guided by an Action-by-Design-Component (ADC) framework, which we also applied to scatterplots to demonstrate its generalizability. A video and download are available at www.cs.umd.edu/hcil/iSonic/.