EyePoint: practical pointing and selection using gaze and keyboard

  • Authors:
  • Manu Kumar, Andreas Paepcke, Terry Winograd

  • Affiliations:
  • Stanford University, HCI Group, Stanford, CA (all authors)

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2007

Abstract

We present a practical technique for pointing and selection using a combination of eye gaze and keyboard triggers. EyePoint uses a two-step progressive refinement process, fluidly stitched together in a look-press-look-release action, which makes it possible to compensate for the accuracy limitations of current state-of-the-art eye gaze trackers. While research in gaze-based pointing has traditionally focused on disabled users, EyePoint makes gaze-based pointing effective and simple enough even for able-bodied users to use in their everyday computing tasks. As the cost of eye gaze tracking devices decreases, such gaze-based techniques will become a viable alternative for users who choose not to use a mouse, depending on their abilities, tasks, and preferences.
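
The abstract names only the look-press-look-release action; the Python sketch below illustrates one plausible reading of the two-step refinement: pressing the keyboard trigger freezes the coarse gaze point and magnifies the screen region around it, and releasing the key reads the refined gaze inside the magnified view and maps it back to screen coordinates for the click. The callback interface, the region size (HALF_REGION), and the magnification factor (ZOOM) are hypothetical stand-ins, not the paper's actual implementation.

```python
"""Sketch of a look-press-look-release refinement cycle (assumptions noted).

Only the two-step flow comes from the abstract; the tracker/screen callbacks
and the constants below are hypothetical stand-ins.
"""

from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


HALF_REGION = 120   # assumed half-size (px) of the region magnified around the gaze
ZOOM = 4            # assumed magnification factor of the refinement view


class EyePointSketch:
    """State machine for one look-press-look-release cycle."""

    def __init__(self, read_gaze, show_zoom, hide_zoom, click):
        self.read_gaze = read_gaze   # () -> Point, current (noisy) gaze estimate
        self.show_zoom = show_zoom   # (center, half, zoom) -> Point, shows the
        #   magnified view and returns its top-left screen corner
        self.hide_zoom = hide_zoom   # () -> None, dismisses the magnified view
        self.click = click           # (Point) -> None, synthesizes the mouse click
        self.anchor = None           # coarse gaze point captured on key press
        self.zoom_origin = None      # top-left corner of the magnified view

    def on_press(self):
        # Step 1 ("look" + "press"): freeze the coarse gaze point and magnify
        # the screen region around it.
        self.anchor = self.read_gaze()
        self.zoom_origin = self.show_zoom(self.anchor, HALF_REGION, ZOOM)

    def on_release(self):
        # Step 2 ("look" + "release"): read gaze again inside the magnified
        # view, map it back to original screen coordinates, and click there.
        g = self.read_gaze()
        target = Point(
            (self.anchor.x - HALF_REGION) + (g.x - self.zoom_origin.x) / ZOOM,
            (self.anchor.y - HALF_REGION) + (g.y - self.zoom_origin.y) / ZOOM,
        )
        self.hide_zoom()
        self.click(target)
```

A toy run with stubbed I/O shows the coordinate mapping; because the view is magnified ZOOM times, the tracker's error at the release step is effectively divided by ZOOM when mapped back to the screen, which is how a two-step refinement can compensate for tracker inaccuracy.

```python
# Press while looking near (505, 297), then refine inside the zoomed view.
gazes = iter([Point(505, 297), Point(980, 420)])       # coarse, then refined
ep = EyePointSketch(
    read_gaze=lambda: next(gazes),
    show_zoom=lambda c, h, z: Point(0, 0),             # zoomed view at screen origin
    hide_zoom=lambda: None,
    click=lambda p: print(f"click at ({p.x:.0f}, {p.y:.0f})"),
)
ep.on_press()
ep.on_release()   # -> click at (630, 282)
```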