Head tilting for interaction in mobile contexts
Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services
While mobile devices allow people to carry out various computing and communication tasks everywhere, they have generally lacked support for task execution while the user is in motion. This is because the interaction schemes of most mobile applications are centered on the device's visual display; when in motion (with important body parts, such as the head and hands, moving), it is difficult for the user to read the visual output on the small hand-held display and respond with timely, accurate input. In this paper, we propose an interface that allows the user to interact with a mobile device while in motion, without having to look at it or use one's hands. More specifically, the user interacts, via gaze direction and head-motion gestures, with an invisible virtual interface panel, aided by a head-worn gyro sensor and aural feedback. Since the menu is one of the most prevalent methods of interaction, we investigate and focus on various forms of menu presentation, such as the layout and the number of comfortably selectable menu items. With head motion, a 4×2 or 3×3 grid menu turns out to be more effective. The results of this study can be further extended toward developing a more sophisticated non-visually-oriented mobile interface.
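The core selection mechanism described above — mapping head orientation onto cells of an invisible grid-menu panel — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grid size matches the 4×2 layout the study found effective, but the comfortable yaw/pitch ranges (±30° and ±20°) and the function names are illustrative assumptions.

```python
# Hypothetical sketch: map head orientation (yaw, pitch in degrees, e.g.
# from a head-worn gyro sensor) to a cell of an invisible grid menu.
# The ±30° yaw / ±20° pitch "comfortable range" values are assumptions,
# not figures reported in the paper.

COLS, ROWS = 4, 2              # 4x2 grid layout found effective in the study
YAW_RANGE = (-30.0, 30.0)      # assumed comfortable left-right head range
PITCH_RANGE = (-20.0, 20.0)    # assumed comfortable up-down head range


def select_cell(yaw, pitch):
    """Return (col, row) of the menu cell the head currently points at,
    or None when the head is outside the virtual panel."""

    def to_index(value, lo, hi, n):
        if not (lo <= value <= hi):
            return None
        # Normalize to [0, 1] and bucket into n equal slices;
        # clamp so the upper boundary maps to the last cell.
        t = (value - lo) / (hi - lo)
        return min(int(t * n), n - 1)

    col = to_index(yaw, *YAW_RANGE, COLS)
    row = to_index(pitch, *PITCH_RANGE, ROWS)
    if col is None or row is None:
        return None
    return (col, row)
```

In a running system, the selected cell would be announced through aural feedback so the user can confirm an item without glancing at the device; a head gesture (e.g. a nod) could then commit the selection.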