Gaze-directed hands-free interface for mobile interaction

  • Authors:
  • Gie-seo Park, Jong-gil Ahn, Gerard J. Kim

  • Affiliations:
  • Digital Experience Laboratory, Korea University, Seoul, Korea (all authors)

  • Venue:
  • HCII'11: Proceedings of the 14th International Conference on Human-Computer Interaction: Interaction Techniques and Environments, Part II
  • Year:
  • 2011


Abstract

While mobile devices have allowed people to carry out various computing and communication tasks everywhere, they have generally lacked support for task execution while the user is in motion. This is because the interaction schemes of most mobile applications are centered around the device's visual display; when the user is in motion (with important body parts, such as the head and hands, moving), it is difficult to recognize the visual output on the small hand-carried display and respond with timely and proper input. In this paper, we propose an interface that allows the user to interact with a mobile device while moving, without having to look at it or use one's hands. More specifically, the user interacts, by gaze and head-motion gestures, with an invisible virtual interface panel, aided by a head-worn gyro sensor and aural feedback. Since the menu is one of the most prevalent methods of interaction, we investigate and focus on various forms of menu presentation, such as the layout and the number of comfortably selectable menu items. With head motion, a 4×2 or 3×3 grid menu turns out to be more effective. The results of this study can be further extended toward developing a more sophisticated, non-visually oriented mobile interface.
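
To make the described interaction concrete, below is a minimal sketch, in Python, of how head orientation dead-reckoned from a head-worn gyro could be mapped onto an invisible 3×3 grid menu with dwell-based selection. None of this code is from the paper: the panel's ±15° angular extent, the 1-second dwell threshold, and all function names are illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation): head yaw/pitch,
# integrated from a head-worn gyro, is mapped onto an invisible 3x3 grid
# menu, with dwell-based selection and aural feedback on cell changes.
# Grid size, panel extent, and dwell threshold are assumed values.

GRID_ROWS, GRID_COLS = 3, 3
YAW_RANGE = PITCH_RANGE = 15.0   # assumed half-extent of the virtual panel, degrees
DWELL_TIME = 1.0                 # assumed dwell threshold for selection, seconds


def integrate_gyro(yaw, pitch, yaw_rate, pitch_rate, dt):
    """Dead-reckon head orientation (degrees) from gyro rates (deg/s)."""
    return yaw + yaw_rate * dt, pitch + pitch_rate * dt


def grid_cell(yaw, pitch):
    """Return the (row, col) the head points at, or None if off the panel."""
    if abs(yaw) > YAW_RANGE or abs(pitch) > PITCH_RANGE:
        return None
    col = min(int((yaw + YAW_RANGE) / (2 * YAW_RANGE) * GRID_COLS), GRID_COLS - 1)
    row = min(int((PITCH_RANGE - pitch) / (2 * PITCH_RANGE) * GRID_ROWS), GRID_ROWS - 1)
    return row, col


def dwell_select(samples, announce):
    """Walk (yaw_rate, pitch_rate, dt) gyro samples; call announce(cell)
    on each cell change (e.g. text-to-speech of the menu label) and
    return the first cell held steadily for DWELL_TIME seconds."""
    yaw = pitch = 0.0
    current, held = None, 0.0
    for yaw_rate, pitch_rate, dt in samples:
        yaw, pitch = integrate_gyro(yaw, pitch, yaw_rate, pitch_rate, dt)
        cell = grid_cell(yaw, pitch)
        if cell != current:
            current, held = cell, 0.0
            if cell is not None:
                announce(cell)
        elif cell is not None:
            held += dt
            if held >= DWELL_TIME:
                return cell
    return None
```

As a design note, dwell-based confirmation is only one plausible trigger here; a discrete head-nod gesture detected from the same gyro stream would avoid accidental selections while walking, at the cost of an extra motion per choice.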