Interactive sonification of complex data

  • Authors:
  • Sandra Pauletto; Andy Hunt

  • Affiliations:
  • Department of Theatre, Film and Television, The University of York, Heslington, York YO10 5DQ, United Kingdom; Department of Electronics, The University of York, Heslington, York YO10 5DD, United Kingdom

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2009


Abstract

In this paper we present two experiments on implementing interaction in sonification displays: the first focuses on recorded data (interactive navigation), the second on data gathered in real time (auditory feedback). In the first experiment, complex synthesised data are explored to evaluate how well known characteristics in the data can be distinguished under different interaction methods; in the second, real medical data from physiotherapy are used. We find that adding interaction to the exploration of sonified recorded data improves system usability (efficiency, effectiveness and user satisfaction), and that real-time sonification of complex physiotherapy data can produce sounds whose timbral characteristics audibly change when important characteristics of the data vary.
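
The abstract describes mapping variations in a data stream to audible changes in timbre. The sketch below is a minimal, illustrative example of that general idea (parameter-mapping sonification) and is not the authors' implementation: a hypothetical one-dimensional signal is mapped to the modulation index of a simple FM tone, so changes in the data become changes in the tone's brightness. All names and parameter choices here are assumptions.

```python
# Minimal parameter-mapping sonification sketch (illustrative only; not the
# method from the paper). A slow 1-D data stream is mapped to the modulation
# index of an FM tone, so variations in the data become audible timbral changes.
import numpy as np

SR = 44_100          # audio sample rate (Hz)
CARRIER = 220.0      # carrier frequency (Hz)
MOD_RATIO = 2.0      # modulator/carrier frequency ratio

def sonify(data, seconds_per_point=0.1):
    """Render a data series as an FM tone whose timbre tracks the data."""
    data = np.asarray(data, dtype=float)
    # Normalise the data to [0, 1]; constant signals map to a fixed timbre.
    lo, hi = data.min(), data.max()
    norm = (data - lo) / (hi - lo) if hi > lo else np.zeros_like(data)
    n_total = int(len(data) * seconds_per_point * SR)
    # Up-sample the slow data to audio rate by linear interpolation,
    # then scale it to a modulation-index range of 0..8.
    index = np.interp(np.linspace(0, len(data) - 1, n_total),
                      np.arange(len(data)), norm) * 8.0
    t = np.arange(n_total) / SR
    modulator = np.sin(2 * np.pi * CARRIER * MOD_RATIO * t)
    audio = np.sin(2 * np.pi * CARRIER * t + index * modulator)
    return (audio * 0.5).astype(np.float32)

if __name__ == "__main__":
    # Hypothetical stand-in for a physiotherapy signal: a noisy ramp.
    signal = np.linspace(0.0, 1.0, 50) + 0.05 * np.random.randn(50)
    samples = sonify(signal)  # write to a WAV file with e.g. the soundfile package
```

FM modulation index is used here only because it gives a clearly audible timbral change from a single control parameter; any mapping from data features to synthesis parameters could serve the same illustrative purpose.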