Blind and visually impaired people rarely visit unknown cities or places without assistance. One reason is that they have hardly any way to gain a non-visual overview of a new place, its landmarks, and its geographic entities in advance, whereas sighted people can simply consult a printed or digital map. Existing haptic and acoustic approaches do not offer an economical way to convey the layout of a map and the relations between objects, such as distance, direction, and object size. We present an interactive three-dimensional sonification interface for exploring city maps. By virtually exploring an auditory map at home, a blind person can build a mental model of an area's structure. Geographic objects and landmarks are represented by sound areas placed within a sound room; each object type is associated with a distinct sound and can therefore be identified. By investigating the auditory map, the user forms an impression of the various objects, their directions, and their relative distances. First user tests show that users are able to reproduce a sonified city map that comes close to the original visual map. Exploring a map through non-speech sound areas thus provides a new user-interface metaphor whose potential extends beyond blind and visually impaired persons to applications for sighted users.
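The abstract does not specify how a sound area's perceived direction and distance are rendered. As a minimal illustrative sketch (not the authors' implementation), one common approach is to attenuate a sound area's amplitude with distance from the virtual listener and to pan it by azimuth; the function name, coordinate convention, and linear falloff model below are all assumptions for illustration.

```python
import math

def sound_params(listener, obj, max_radius=200.0):
    """Hypothetical rendering model for one sound area: linear distance
    attenuation plus azimuth-based stereo panning, relative to a
    virtual listener position in 2D map coordinates (metres)."""
    dx = obj[0] - listener[0]
    dy = obj[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Amplitude falls off linearly and is silent beyond max_radius.
    amplitude = max(0.0, 1.0 - dist / max_radius)
    # Azimuth: 0 rad = straight ahead (+y); pan in [-1 left, +1 right].
    azimuth = math.atan2(dx, dy)
    pan = math.sin(azimuth)
    return amplitude, pan

# Example: a "park" sound area 50 m ahead and 20 m to the right.
amp, pan = sound_params((0.0, 0.0), (20.0, 50.0))
```

A real system would replace this with binaural or multi-channel spatialization in the sound room, but the sketch captures the basic idea of encoding direction and relative distance in non-speech sound.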