Eyes-free interaction with free-hand gestures and auditory menus
International Journal of Human-Computer Studies
A novel interaction method for eyes-free control of a mobile phone or media player is introduced. The method uses acceleration sensors along three axes to detect input gestures such as pointing and tilting. A spherical auditory menu provides feedback through speech and 3D sound. A gestural pointing interface, multiple menu configurations, and their implementation details are presented. Evaluation results suggest that menu items can be selected quickly and accurately without visual feedback. Combining the gestural interface, the 3D positions of menu items, and a browsing method with dynamically adjustable target sizes allows large menus to be accessed intuitively and easily.
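The core pipeline described above (tilt sensing, spherical item layout, nearest-item selection within an adjustable target size) can be sketched as follows. This is a minimal illustrative Python sketch, not the paper's implementation: the tilt-to-direction mapping, the angular-distance selection rule, and all function names are assumptions introduced here for clarity.

```python
import math

def tilt_to_direction(ax, ay, az):
    """Convert raw three-axis accelerometer readings (the gravity
    vector at rest) into a pointing direction (azimuth, elevation)
    in radians. An illustrative mapping, not the paper's."""
    # Azimuth: rotation around the vertical axis (tilt left/right).
    azimuth = math.atan2(ax, az)
    # Elevation: tilt forward/backward relative to the horizontal plane.
    elevation = math.atan2(ay, math.sqrt(ax * ax + az * az))
    return azimuth, elevation

def select_item(direction, menu_items, target_radius):
    """Return the menu item nearest to the pointing direction, or
    None if nothing lies within target_radius (radians). Enlarging
    target_radius models the dynamically adjustable target size."""
    az, el = direction
    best, best_dist = None, target_radius
    for name, (item_az, item_el) in menu_items.items():
        # Great-circle (angular) distance between the pointing
        # direction and the item's position on the menu sphere.
        cos_d = (math.sin(el) * math.sin(item_el)
                 + math.cos(el) * math.cos(item_el)
                 * math.cos(az - item_az))
        d = math.acos(max(-1.0, min(1.0, cos_d)))  # clamp for float safety
        if d < best_dist:
            best, best_dist = name, d
    return best
```

For example, with items laid out on the horizontal ring of the sphere, `select_item(tilt_to_direction(0.1, 0.0, 1.0), {"play": (0.0, 0.0), "next": (math.pi / 2, 0.0)}, 0.5)` would pick the item whose angular position is closest to the sensed tilt.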