Haptic virtual reality for blind computer users
Assets '98 Proceedings of the third international ACM conference on Assistive technologies
Retrieving information that is presented visually is difficult for visually impaired users. Current accessibility technologies, such as screen readers, fail to convey presentational layout or structure, and information presented in graphs or images is almost impossible to convey through speech alone. In this paper, we present the results of an experimental study investigating the role of touch (haptic) and auditory cues in aiding structure recognition when visual presentation is unavailable. We hypothesized that guiding users toward the nodes of a graph structure with force fields would make the overall structure easier to recognize. Nine participants explored simple 3D structures containing nodes (spheres or cubes) laid out in various spatial configurations, then identified the nodes and drew their overall structure. Various combinations of haptic and auditory feedback were tested. Our results show that haptic cues significantly helped participants recognize nodes and structure quickly. Surprisingly, auditory cues alone did not speed up node recognition; however, when combined with haptic cues, both node identification and structure recognition improved significantly. This result demonstrates that haptic feedback plays an important role in enabling people to recall spatial layout.
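The abstract does not give implementation details for the force fields that guide users toward nodes. One common way to realise such guidance on a haptic device is a spring-like attraction toward the nearest node once the haptic proxy enters an attraction radius. The sketch below illustrates that idea only; the function name, the radius, the stiffness constant, and the Hooke-style force model are all assumptions, not taken from the paper.

```python
import math

# Assumed tuning constants (not from the paper).
ATTRACTION_RADIUS = 0.05   # metres: how close the proxy must be before it is pulled
STIFFNESS = 200.0          # N/m: spring constant of the attractive force

def guidance_force(proxy, nodes):
    """Return an (fx, fy, fz) force pulling the haptic proxy toward the
    nearest node centre, or a zero force if no node is within range.

    proxy: (x, y, z) position of the haptic cursor.
    nodes: list of (x, y, z) node centres in the 3D structure.
    """
    best, best_dist = None, ATTRACTION_RADIUS
    for node in nodes:
        d = math.dist(proxy, node)
        if d < best_dist:          # strictly inside the attraction radius
            best, best_dist = node, d
    if best is None or best_dist == 0.0:
        return (0.0, 0.0, 0.0)     # out of range, or already at the centre
    # Hooke-style pull: force grows linearly with distance from the node.
    return tuple(STIFFNESS * (c - p) for p, c in zip(proxy, best))
```

In use, a haptic rendering loop would call such a function every frame and send the resulting force to the device, so the user feels a gentle "snap" toward each node while exploring; a real implementation would also cap the force magnitude for device safety.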