Pursuit calibration: making gaze calibration less tedious and more flexible

  • Authors:
  • Ken Pfeuffer; Melodie Vidal; Jayson Turner; Andreas Bulling; Hans Gellersen

  • Affiliations:
  • Lancaster University, Lancaster, United Kingdom (Pfeuffer, Vidal, Turner, Gellersen); Max Planck Institute for Informatics, Saarbrücken, Germany (Bulling)

  • Venue:
  • Proceedings of the 26th annual ACM symposium on User interface software and technology
  • Year:
  • 2013

Abstract

Eye gaze is a compelling interaction modality but requires user calibration before interaction can commence. State-of-the-art procedures require the user to fixate on a succession of calibration markers, a task that is often experienced as difficult and tedious. We present pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target. This is achieved by using moving targets and correlating eye movement with the target trajectory, implicitly exploiting smooth pursuit eye movement. Calibration data is then sampled only when the user is attending to the target. Because of its ability to detect user attention, pursuit calibration can be performed implicitly, which enables more flexible designs of the calibration task. We demonstrate this in application examples and user studies, and show that pursuit calibration is tolerant to interruption, can blend naturally with applications, and is able to calibrate users without their awareness.
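
The sketch below illustrates the idea described in the abstract: correlate the raw gaze trajectory with the moving target's trajectory over a sliding window, keep gaze-target pairs as calibration samples only while the correlation is high, and finally fit a mapping from raw gaze to screen coordinates. The window size, correlation threshold, Pearson correlation test, and affine least-squares mapping are illustrative assumptions, not the exact parameters or model used in the paper.

```python
# Minimal sketch of pursuit-style calibration (assumed parameters and model).
from collections import deque
import numpy as np

WINDOW = 30       # samples per correlation window (~0.5 s at 60 Hz) -- assumed
THRESHOLD = 0.8   # minimum per-axis Pearson correlation -- assumed

gaze_win = deque(maxlen=WINDOW)    # recent raw eye-tracker coordinates
target_win = deque(maxlen=WINDOW)  # recent on-screen positions of the moving target
samples = []                       # (raw_gaze, target) pairs kept for calibration

def on_new_sample(raw_gaze, target_pos):
    """Keep a calibration sample only while the eye is pursuing the target."""
    gaze_win.append(raw_gaze)
    target_win.append(target_pos)
    if len(gaze_win) < WINDOW:
        return
    g = np.asarray(gaze_win)
    t = np.asarray(target_win)
    # Correlate eye and target trajectories on each axis separately.
    r_x = np.corrcoef(g[:, 0], t[:, 0])[0, 1]
    r_y = np.corrcoef(g[:, 1], t[:, 1])[0, 1]
    if r_x > THRESHOLD and r_y > THRESHOLD:
        # High correlation on both axes: the user is attending to the target.
        samples.append((np.asarray(raw_gaze), np.asarray(target_pos)))

def fit_mapping():
    """Least-squares affine map from raw gaze to screen coordinates (assumed model)."""
    raw = np.array([s[0] for s in samples])
    scr = np.array([s[1] for s in samples])
    A = np.hstack([raw, np.ones((len(raw), 1))])   # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, scr, rcond=None)
    return lambda p: np.append(np.asarray(p), 1.0) @ coeffs
```

Because samples accumulate only while gaze and target trajectories correlate, the procedure tolerates interruptions: if the user looks away, correlation drops, no samples are collected, and collection resumes when pursuit resumes.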