Robustifying Eye Interaction

  • Authors:
  • Dan Witzner Hansen; John Paulin Hansen

  • Affiliations:
  • IT University, Copenhagen; IT University, Copenhagen

  • Venue:
  • CVPRW '06 Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop
  • Year:
  • 2006


Abstract

This paper presents a gaze typing system based on consumer hardware. Eye tracking based on consumer hardware is subject to several unknown factors. We propose methods using robust statistical principles to accommodate uncertainties in image data as well as in gaze estimates, thereby improving accuracy. We have succeeded in tracking the gaze of people with a standard consumer camera, obtaining accuracies of about 160 pixels on screen. Proper design of the typing interface, however, reduces the need for high accuracy. We have observed typing speeds in the range of 3 to 5 words per minute for untrained subjects using large on-screen buttons and a new noise-tolerant dwell-time principle.
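The abstract does not spell out how the noise-tolerant dwell-time principle works. The sketch below is one plausible reading, not the authors' implementation: accumulated dwell decays rather than resetting when the gaze estimate briefly leaves a button, so jitter or short tracking dropouts do not cancel an ongoing selection. The class name and all parameter values are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class NoiseTolerantDwell:
    """Hypothetical dwell-time selector that tolerates brief gaps in the gaze signal."""

    dwell_threshold: float = 1.0   # seconds of accumulated gaze needed to trigger a button
    decay_rate: float = 0.5        # how fast accumulated dwell drains while gaze is away

    def __post_init__(self) -> None:
        self.accumulated = 0.0

    def update(self, gaze_on_button: bool, dt: float) -> bool:
        """Advance the dwell state by dt seconds; return True when the button fires."""
        if gaze_on_button:
            self.accumulated += dt
        else:
            # Decay instead of a hard reset: a few noisy frames off-target
            # only drain a little of the accumulated dwell.
            self.accumulated = max(0.0, self.accumulated - self.decay_rate * dt)

        if self.accumulated >= self.dwell_threshold:
            self.accumulated = 0.0   # reset after a completed selection
            return True
        return False


# Usage: feed per-frame hit tests from the gaze estimator (e.g. at 30 Hz).
dwell = NoiseTolerantDwell()
frames = [True] * 20 + [False] * 3 + [True] * 15   # a brief dropout mid-dwell
for hit in frames:
    if dwell.update(hit, dt=1 / 30):
        print("button selected")
```

With a hard-reset dwell timer the three-frame dropout above would restart the count; with the decaying accumulator the selection still completes, which matches the abstract's point that interface design can compensate for limited gaze accuracy.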