Visualizing non-speech sounds for the deaf

  • Authors:
  • Tara Matthews; Janette Fong; Jennifer Mankoff

  • Affiliations:
  • EECS, UC Berkeley, Berkeley, CA; HCII, Carnegie Mellon, Pittsburgh, PA; HCII, Carnegie Mellon, Pittsburgh, PA

  • Venue:
  • Proceedings of the 7th international ACM SIGACCESS conference on Computers and accessibility (ASSETS '05)
  • Year:
  • 2005


Abstract

Sounds constantly occur around us, keeping us aware of our surroundings. People who are deaf have difficulty maintaining an awareness of these ambient sounds. We present an investigation of peripheral, visual displays to help people who are deaf maintain an awareness of sounds in the environment. Our contribution is twofold. First, we present a set of visual design preferences and functional requirements for peripheral visualizations of non-speech audio that will help improve future applications. Visual design preferences include ease of interpretation, glanceability, and an appropriate level of distraction. Functional requirements include the ability to identify what sound occurred, view a history of displayed sounds, customize the information that is shown, and determine the accuracy of displayed information. Second, we designed, implemented, and evaluated two fully functioning prototypes that embody these preferences and requirements, serving as examples for future designers and furthering progress toward understanding how best to provide peripheral audio awareness for the deaf.
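
The four functional requirements map naturally onto a small event model. What follows is a minimal sketch of ours, not code from the paper: all names (`SoundEvent`, `PeripheralSoundDisplay`, the example labels) are hypothetical, and it assumes an upstream sound classifier that emits labeled events with confidence scores. It shows one way a display could track what sound occurred, keep a bounded history, honor per-user customization, and surface classification accuracy.

```python
from collections import deque
from dataclasses import dataclass
from datetime import datetime


@dataclass
class SoundEvent:
    """One detected non-speech sound, as a display might represent it."""
    label: str         # what sound occurred, e.g. "door knock"
    confidence: float  # classifier's accuracy estimate (0.0-1.0), shown to the user
    timestamp: datetime


class PeripheralSoundDisplay:
    """Tracks recent sound events for a glanceable, customizable history view."""

    def __init__(self, history_size: int = 20, watched_labels: set[str] | None = None):
        # Bounded history of displayed sounds; old events fall off automatically.
        self.history: deque[SoundEvent] = deque(maxlen=history_size)
        # User customization: None means display every detected sound.
        self.watched_labels = watched_labels

    def on_sound(self, event: SoundEvent) -> bool:
        """Record an event; return True if the display should render it."""
        if self.watched_labels is not None and event.label not in self.watched_labels:
            return False  # filtered out by the user's customization
        self.history.append(event)
        return True

    def recent(self, n: int = 5) -> list[SoundEvent]:
        """The n most recent events, newest first, for the history view."""
        return list(self.history)[-n:][::-1]


# Example: a user who only wants to see door knocks and the phone ringing.
display = PeripheralSoundDisplay(watched_labels={"door knock", "phone ring"})
display.on_sound(SoundEvent("door knock", confidence=0.87, timestamp=datetime.now()))
for event in display.recent():
    print(f"{event.timestamp:%H:%M}  {event.label}  ({event.confidence:.0%})")
```

Rendering the confidence alongside the label addresses the fourth requirement directly: when the classifier is unsure, the user can weigh the display's claim rather than trust it outright.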