Real-world tagging technologies such as RFID and visual codes have enabled new application scenarios that foster mobile interaction with the physical world. While these scenarios are promising in many contexts, the technologies currently lack accessibility: blind and visually impaired people in particular cannot interact with tags whose presence they are unaware of. We propose audio-tactile location markers as a remedy to this problem. An audible signal leads users to the tag, which they can then identify through tactile exploration. A preliminary study in which four blindfolded subjects used an initial prototype showed that an audible signal is applicable for locating tags.
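The guidance principle behind such a marker can be sketched as a mapping from estimated distance to beep rate: the closer the user's device is to the tag, the faster the beeping. A minimal Python sketch, assuming a linear distance-to-interval mapping (the function names, parameter values, and the mapping itself are illustrative assumptions, not taken from the paper):

```python
def beep_interval(distance_m, min_interval=0.1, max_interval=1.0, max_range_m=3.0):
    """Map an estimated distance to the tag (in meters) to the pause
    between beeps (in seconds): a closer tag yields faster beeping.

    Hypothetical scheme for illustration; the paper does not specify
    the exact audio mapping used by the prototype.
    """
    # Clamp the distance to the supported sensing range.
    d = max(0.0, min(distance_m, max_range_m))
    # Linearly interpolate between the fastest and slowest beep rate.
    return min_interval + (max_interval - min_interval) * (d / max_range_m)


# Example: intervals shrink monotonically as the user approaches the tag.
for d in (3.0, 2.0, 1.0, 0.0):
    print(f"{d:.1f} m -> beep every {beep_interval(d):.2f} s")
```

A real prototype would drive a speaker or vibration motor from this interval in a loop; the sketch only captures the proximity-to-feedback mapping, which is the part the study's audible signal relies on.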