Hybrid user interfaces: breeding virtually bigger interfaces for physically smaller computers
UIST '91 Proceedings of the 4th annual ACM symposium on User interface software and technology
VoiceNotes: a speech interface for a hand-held voice notetaker
CHI '93 Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems
Tilting operations for small screen interfaces
Proceedings of the 9th annual ACM symposium on User interface software and technology
Sensing techniques for mobile interaction
UIST '00 Proceedings of the 13th annual ACM symposium on User interface software and technology
Nomadic radio: speech and audio interaction for contextual messaging in nomadic environments
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on human-computer interaction with mobile systems
Gestural and audio metaphors as a means of control for mobile devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Peephole displays: pen interaction on spatially aware handheld computers
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Multimodal 'eyes-free' interaction techniques for wearable devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
XWand: UI for intelligent spaces
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Mobile ADVICE: an accessible device for visually impaired capability enhancement
CHI '03 Extended Abstracts on Human Factors in Computing Systems
Inertial Head-Tracker Sensor Fusion by a Complementary Separate-Bias Kalman Filter
VRAIS '96 Proceedings of the 1996 Virtual Reality Annual International Symposium (VRAIS 96)
Prototyping retractable string-based interaction techniques for dual-display mobile devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Use your head: exploring face tracking for mobile interaction
CHI '06 Extended Abstracts on Human Factors in Computing Systems
Adaptive blind interaction technique for touchscreens
Universal Access in the Information Society
Can we do without GUIs? Gesture and speech interaction with a patient information system
Personal and Ubiquitous Computing
Earpod: eyes-free menu selection using touch input and reactive audio feedback
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A motion-based marking menu system
CHI '07 Extended Abstracts on Human Factors in Computing Systems
Blindsight: eyes-free access to mobile phones
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Flashlight jigsaw: an exploratory study of an ad-hoc multi-player game on public displays
Proceedings of the 2008 ACM conference on Computer supported cooperative work
Piles across space: Breaking the real-estate barrier on small-display devices
International Journal of Human-Computer Studies
Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices
Proceedings of the 22nd annual ACM symposium on User interface software and technology
Virtual shelves: interactions with orientation aware devices
Proceedings of the 22nd annual ACM symposium on User interface software and technology
Facilitating photographic documentation of accessibility in street scenes
CHI '11 Extended Abstracts on Human Factors in Computing Systems
Understanding palm-based imaginary interfaces: the role of visual and tactile cues when browsing
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A smart watch-based gesture recognition system for assisting people with visual impairments
Proceedings of the 3rd ACM international workshop on Interactive multimedia on mobile & portable devices
How to make large touch screens usable while driving
Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Haptic target acquisition to enable spatial gestures in nonvisual displays
Proceedings of Graphics Interface 2013
Accessing the advanced functions of a mobile phone is not a trivial task for users with visual impairments. They rely on screen readers and voice commands to discover and execute functions. In mobile situations, however, screen readers are not ideal because users may depend on their hearing for safety, and voice commands are difficult for a system to recognize in noisy environments. In this paper, we extend Virtual Shelves, an interaction technique that leverages proprioception to access application shortcuts, for users with visual impairments. We measured the directional accuracy of participants with visual impairments and found that they were less accurate than sighted people. We then built a functional prototype that uses an accelerometer and a gyroscope to sense its position and orientation. Finally, we evaluated the interaction and prototype by allowing participants to customize the placement of seven shortcuts within 15 regions. Participants were able to access shortcuts in their personal layout with 88.3% accuracy in an average of 1.74 seconds.
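The selection step the abstract describes can be sketched in code: the device's sensed orientation is quantized into one of 15 regions arranged in front of the user. This is a minimal illustration, not the paper's implementation; the 5x3 grid layout, the angular bounds, and the `shelf_region` helper name are all assumptions made for the example.

```python
from typing import Optional

# Hedged sketch of Virtual Shelves region selection: map a sensed
# (yaw, pitch) orientation to one of 15 regions. The 5x3 grid and the
# angular bounds below are illustrative assumptions, not the paper's
# calibrated layout.

YAW_BOUNDS = (-75.0, 75.0)    # degrees left/right of straight ahead (assumed)
PITCH_BOUNDS = (-30.0, 60.0)  # degrees below/above horizontal (assumed)
COLS, ROWS = 5, 3             # 5 columns x 3 rows = 15 regions

def shelf_region(yaw_deg: float, pitch_deg: float) -> Optional[int]:
    """Quantize a (yaw, pitch) orientation into a region index 0-14,
    or return None when the device points outside the shelf area."""
    if not YAW_BOUNDS[0] <= yaw_deg <= YAW_BOUNDS[1]:
        return None
    if not PITCH_BOUNDS[0] <= pitch_deg <= PITCH_BOUNDS[1]:
        return None
    yaw_span = YAW_BOUNDS[1] - YAW_BOUNDS[0]
    pitch_span = PITCH_BOUNDS[1] - PITCH_BOUNDS[0]
    col = min(COLS - 1, int((yaw_deg - YAW_BOUNDS[0]) / yaw_span * COLS))
    row = min(ROWS - 1, int((pitch_deg - PITCH_BOUNDS[0]) / pitch_span * ROWS))
    return row * COLS + col
```

In a full system, the returned index would look up the user's personalized shortcut table, so that the seven customized shortcuts occupy seven of the 15 regions.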