Spatial data are increasingly available, but the ubiquitous use of graphical displays to communicate such data renders them inaccessible to people who are blind or have low vision. This not only limits access to data; it also restricts educational opportunities, because accessible maps and geographic information systems are scarce. The scarcity may stem in part from the challenge of creating a system that provides a usable display without relying on vision: simply replacing the symbology of a map designed for a two-dimensional graphical display with parameters for another modality, such as audio with its single primary axis (time), is insufficient. To address the need for accessible learning materials, we present a minimal geographic information system (mGIS) that uses an auditory display in combination with a tablet and stylus input device. Non-speech audio communicates attribute data, text-to-speech software renders feedback from the application menus, and kinesthetic feedback from actively controlling the stylus conveys location within the display. This paper presents details of the software implementation, discusses the development of an auditory symbology for choropleth maps (maps that display patterns of data over geographic space), and describes an initial usability evaluation.
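One way to picture the core of such an auditory symbology is a mapping from a region's attribute value to an audio parameter such as pitch. The sketch below is illustrative only, not the paper's implementation: it assumes a simple linear rescaling of values into a frequency range, and the function name, frequency bounds, and sample data are all hypothetical.

```python
def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Map a choropleth attribute value to a frequency in Hz.

    Linearly rescales value from [vmin, vmax] into [f_low, f_high],
    so that higher data values are rendered as higher pitches.
    The two-octave default range (220-880 Hz) is an assumption.
    """
    if vmax == vmin:
        return f_low  # degenerate case: all regions share one value
    t = (value - vmin) / (vmax - vmin)
    return f_low + t * (f_high - f_low)

# Hypothetical population densities for three map regions
densities = {"A": 12.0, "B": 55.0, "C": 98.0}
lo, hi = min(densities.values()), max(densities.values())
pitches = {k: value_to_pitch(v, lo, hi) for k, v in densities.items()}
```

In a working system, the resulting frequency would drive a synthesizer as the stylus enters each region; a discrete mapping (a small set of fixed pitches for classed choropleth bins) is an equally plausible design choice.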