Eye typing off the shelf

  • Authors:
  • Dan Witzner Hansen (Dept. of Innovation, IT University of Copenhagen, Copenhagen, Denmark); Arthur Pece (Dept. of Computer Science, University of Copenhagen, Copenhagen, Denmark)

  • Venue:
  • CVPR'04: Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2004


Abstract

The goal of this work is to use off-the-shelf components for gaze-based interaction, with a focus on eye typing. Avoiding dedicated hardware such as IR light emitters makes eye tracking significantly more difficult and requires robust methods capable of handling large changes in image quality. We employ an active-contour method to obtain robust iris tracking. The main strength of the method is that the contour model avoids explicit feature detection: contours are simply assumed to remove statistical dependencies between opposite sides of the contour. The contour model is used in an approach that combines particle filtering with the EM algorithm. The method is robust against light changes and camera defocusing. To determine where the user is looking, calibration is usually needed. The number of calibration points used in different methods varies from a few to several thousand, depending on the prior knowledge assumed about the setup and equipment. We examine basic properties of gaze determination when the geometry of the camera, screen, and user is unknown. In particular, we present a lower bound on the number of calibration points needed for gaze determination on planar objects, and we examine degenerate configurations. Based on this lower bound, we apply a simple calibration procedure to facilitate button selection for fast on-screen typing.
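When the geometry of the camera, screen, and user is unknown but the gaze target is planar, the image-to-screen mapping can be modeled as a 2D projective transform (a homography), which a small number of calibration points suffices to fit. As an illustrative sketch only — not the paper's exact calibration procedure — the following estimates such a mapping from corresponding tracked-iris and on-screen points via the direct linear transform; the function names and the four-corner calibration layout are assumptions for the example:

```python
import numpy as np

def estimate_homography(src, dst):
    """Fit a 3x3 homography H mapping src -> dst (each a list of >= 4
    2D points) with the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(A, dtype=float)
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2,2] == 1

def map_point(H, p):
    """Apply homography H to a 2D point p (homogeneous division)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Hypothetical calibration: the user fixates four screen corners while
# the iris center is tracked in image coordinates.
iris_pts   = [(210, 160), (430, 155), (205, 330), (440, 335)]
screen_pts = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
H = estimate_homography(iris_pts, screen_pts)
gaze_on_screen = map_point(H, (320, 245))   # estimated gaze point
```

With exactly four correspondences in general position the system has an eight-dimensional row space, so the homography is determined up to scale; degenerate configurations (e.g. three collinear calibration points) make the fit ill-posed, which mirrors the degenerate cases the abstract mentions.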