eyeCan: affordable and versatile gaze interaction

  • Authors:
  • Sang-won Leigh

  • Affiliations:
  • MIT Media Lab, Cambridge, MA, USA

  • Venue:
  • Proceedings of the adjunct publication of the 26th annual ACM symposium on User interface software and technology
  • Year:
  • 2013

Abstract

We present eyeCan, a software system that enables rich, sophisticated, yet usable gaze interaction with low-cost gaze tracking setups. We created this practical system to drastically lower the barrier to gaze interaction by providing easy-to-use gaze gestures and by reducing the cost of entry through the use of low-precision gaze trackers. Our system effectively compensates for noise from the tracking sensors and from involuntary eye movements, improving both the precision and the speed of cursor control. We also explored and defined the possible variety of gaze gestures. By combining eyelid actions with gaze direction cues, our system provides a rich set of gaze events and thereby enables sophisticated applications, e.g. playing video games or navigating street view.
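The abstract's core technical claim is that sensor noise and involuntary eye movements can be filtered so that cursor control gains both precision and speed. The paper does not specify its filter, but one common approach for noisy gaze input can be sketched as follows: damp small fixation jitter with an exponential moving average, while letting large jumps (saccades) snap the cursor immediately. All class and parameter names here are illustrative assumptions, not the authors' implementation.

```python
class GazeSmoother:
    """Hypothetical gaze-cursor filter: smooth fixation jitter,
    pass saccades through unfiltered."""

    def __init__(self, alpha=0.2, saccade_threshold=80.0):
        self.alpha = alpha                          # EMA factor for jitter damping
        self.saccade_threshold = saccade_threshold  # pixel jump treated as a saccade
        self.pos = None                             # last smoothed position

    def update(self, x, y):
        if self.pos is None:
            self.pos = (x, y)
            return self.pos
        px, py = self.pos
        dist = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
        if dist > self.saccade_threshold:
            # Saccade: snap to the new target so the cursor stays fast.
            self.pos = (x, y)
        else:
            # Fixation: average out sensor noise for precision.
            self.pos = (px + self.alpha * (x - px),
                        py + self.alpha * (y - py))
        return self.pos
```

The trade-off between the two claims in the abstract, precision and speed, is controlled here by `alpha` (lower means steadier during fixation) and `saccade_threshold` (lower means quicker response to intentional eye jumps).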