Determining patterns in data is an important and often difficult task for scientists and students. Unfortunately, graphing and analysis software is typically inaccessible to users with vision impairment. Using sound to represent data (i.e., sonification or auditory graphs) can make data analysis more accessible; however, there are few guidelines for designing such displays for maximum effectiveness. One crucial yet understudied design issue is exactly how changes in data (e.g., temperature) are mapped onto changes in sound (e.g., pitch), and how this mapping may depend on the specific user. In this study, magnitude estimation was used to determine preferred data-to-display mappings, polarities, and psychophysical scaling functions relating data values to underlying acoustic parameters (frequency, tempo, or modulation index) for blind and visually impaired listeners. The resulting polarities and scaling functions are compared to previous results with sighted participants. The two listener populations generally agreed on polarities, with some notable exceptions. The magnitudes of the scaling-function slopes were also strongly similar, again with some notable differences. For maximum effectiveness, sonification software designers will need to carefully consider their intended users' vision abilities. Practical implications and limitations are discussed.
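The kind of data-to-display mapping the abstract describes can be sketched in a few lines. The sketch below maps a data value onto a frequency using a power-law (Stevens-style) scaling function with a selectable polarity; the frequency range and exponent are illustrative assumptions for this example, not values reported in the study.

```python
def sonify_value(x, x_min, x_max, f_min=200.0, f_max=1000.0,
                 exponent=1.0, polarity=+1):
    """Map a data value x onto a frequency in Hz.

    A power-law scaling function (exponent) models the psychophysical
    relationship between data magnitude and the acoustic parameter.
    Positive polarity: larger data values -> higher frequency;
    negative polarity reverses the direction of the mapping.
    NOTE: f_min, f_max, and exponent are illustrative placeholders,
    not parameters estimated in the study.
    """
    # Normalize the data value to [0, 1]
    t = (x - x_min) / (x_max - x_min)
    # A negative polarity inverts the data-to-display direction
    if polarity < 0:
        t = 1.0 - t
    # Apply the power-law scaling to the normalized value
    t = t ** exponent
    # Linearly span the chosen frequency range
    return f_min + t * (f_max - f_min)
```

In a magnitude-estimation experiment, the exponent (the slope of the scaling function in log-log coordinates) and the preferred polarity would be fit per listener population, which is exactly where the abstract reports both agreement and notable differences between blind and sighted listeners.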