Location by parts: model generation and feature fusion for mobile eye pupil tracking under challenging lighting

  • Authors:
  • Thomas B. Kinsman; Jeff B. Pelz

  • Affiliations:
  • Rochester Institute of Technology, Rochester, NY (both authors)

  • Venue:
  • Proceedings of the 2012 ACM Conference on Ubiquitous Computing
  • Year:
  • 2012

Abstract

Using infrared-based mobile eye trackers outdoors is difficult, and considered intractable by some [1, 2]. Bright, uncontrolled daylight illumination complicates the process of locating the subject's pupil. To make mobile eye tracking more ubiquitous, we are developing more sophisticated algorithms to find the subject's pupil. We use a semi-supervised process to initialize the pupil tracking, automatically generate an ensemble of models of the pupil for each video, and use multi-frame techniques to help locate the pupil across frames. A mixture of experts (consensus) is used to indicate a good estimate of pupil location. The algorithm presented here details developing work on automatically finding the pupil in situations where a significant amount of light reflects off the eye, the subject is squinting, or the pupil is partially occluded. The output of this algorithm will be cascaded into a subsequent stage for exact pupil fitting.
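The abstract does not specify how the mixture-of-experts consensus is computed; one plausible sketch (all names and the agreement-radius scheme are assumptions, not the authors' method) is to let each expert vote an (x, y) pupil estimate and keep the largest cluster of mutually agreeing votes:

```python
def consensus_pupil_location(estimates, agreement_radius=5.0):
    """Hypothetical consensus over per-expert (x, y) pupil estimates.

    The estimate whose neighborhood (within `agreement_radius` pixels)
    contains the most other estimates defines the winning cluster; that
    cluster is averaged. Returns (x, y, support), where support is the
    fraction of experts that agreed.
    """
    best_cluster = None
    for xi, yi in estimates:
        # Collect all estimates within the agreement radius of this one.
        cluster = [(xj, yj) for (xj, yj) in estimates
                   if (xj - xi) ** 2 + (yj - yi) ** 2 <= agreement_radius ** 2]
        if best_cluster is None or len(cluster) > len(best_cluster):
            best_cluster = cluster
    # Average the winning cluster to get the consensus location.
    x = sum(p[0] for p in best_cluster) / len(best_cluster)
    y = sum(p[1] for p in best_cluster) / len(best_cluster)
    return (x, y, len(best_cluster) / len(estimates))
```

A low support value could then flag frames (e.g., heavy reflections or occlusion) where the downstream exact-fitting stage should not trust the estimate.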