Real-time gaze tracking for public displays

  • Authors:
  • Andreas Sippl;Clemens Holzmann;Doris Zachhuber;Alois Ferscha

  • Affiliations:
  • Johannes Kepler University Linz, Institute for Pervasive Computing;Upper Austria University of Applied Sciences, Mobile Computing;Johannes Kepler University Linz, Institute for Pervasive Computing;Johannes Kepler University Linz, Institute for Pervasive Computing

  • Venue:
  • AmI'10: Proceedings of the First International Joint Conference on Ambient Intelligence
  • Year:
  • 2010

Abstract

In this paper, we explore the real-time tracking of human gazes in front of large public displays. The aim of our work is to estimate at which area of a display one or more people are looking at a given time, independently of their distance and angle to the display as well as their height. Gaze tracking is relevant for a variety of purposes, including the automatic recognition of a user's focus of attention or the control of interactive applications with gaze gestures. The present paper focuses on the former, and we show how gaze tracking can be used for implicit interaction in the pervasive advertising domain. We have developed a prototype for this purpose, which (i) uses an overhead-mounted camera to distinguish four gaze areas on a large display, (ii) works for a wide range of positions in front of the display, and (iii) provides an estimate of the currently gazed-at quarters in real time. We present a detailed description of the prototype as well as the results of a user study with 12 participants, which show the recognition accuracy for different positions in front of the display.
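
As an illustration of the quadrant-based output described in the abstract, the sketch below shows how an estimated gaze point, normalized to display coordinates, could be mapped to one of four gaze areas. The function name, coordinate convention, and enum are assumptions made for illustration only; they are not taken from the authors' prototype.

```python
# Minimal sketch (assumption, not the authors' implementation): map an
# estimated gaze point on the display plane to one of four display quadrants.
# Coordinates are normalized to [0, 1], with the origin at the top-left corner.

from enum import Enum


class Quadrant(Enum):
    TOP_LEFT = 0
    TOP_RIGHT = 1
    BOTTOM_LEFT = 2
    BOTTOM_RIGHT = 3


def gaze_to_quadrant(x: float, y: float) -> Quadrant:
    """Return the display quadrant containing the normalized gaze point (x, y)."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("gaze point lies outside the display")
    right = x >= 0.5
    bottom = y >= 0.5
    if bottom:
        return Quadrant.BOTTOM_RIGHT if right else Quadrant.BOTTOM_LEFT
    return Quadrant.TOP_RIGHT if right else Quadrant.TOP_LEFT


if __name__ == "__main__":
    # Example: a gaze estimate in the lower-right area of the display.
    print(gaze_to_quadrant(0.7, 0.8))  # Quadrant.BOTTOM_RIGHT
```

In a full pipeline, per-person gaze estimates from the overhead camera would be normalized to display coordinates and passed through such a mapping for each tracked person, yielding one quadrant estimate per person per frame.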