A semi-automated system for accurate gaze coding in natural dyadic interactions

  • Authors:
  • Kenneth A. Funes Mora, Laurent Nguyen, Daniel Gatica-Perez, Jean-Marc Odobez

  • Affiliations:
  • Idiap Research Institute and École Polytechnique Fédérale de Lausanne, Martigny, Valais, Switzerland (all authors)

  • Venue:
  • Proceedings of the 15th ACM International Conference on Multimodal Interaction
  • Year:
  • 2013

Abstract

In this paper we propose a system capable of accurately coding gazing events in natural dyadic interactions. In contrast to previous work, our approach exploits the actual continuous gaze direction of a participant by leveraging remote RGB-D sensors and a head pose-independent gaze estimation method. Our contributions are: i) we propose a system setup built from low-cost sensors, together with a technique to easily calibrate these sensors in a room under minimal assumptions; ii) we propose a method which, given short manual annotations, can automatically detect gazing events in the rest of the sequence; iii) we demonstrate on substantially long, natural dyadic data that high accuracy can be obtained, showing the potential of our system. Our approach is non-invasive and does not require collaboration from the interactors. These characteristics make it highly valuable for research in psychology and sociology.
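
The sketch below is an illustrative reading of the semi-automated step described in the abstract, not the authors' actual implementation: given per-frame 3D gaze directions (as produced by an RGB-D gaze estimator) and a short manually annotated segment, label the remaining frames as gazing at the partner or not. The angular feature, the logistic-regression classifier, and the function names (angle_to_target, detect_gazing_events) are assumptions made for this example.

    # Illustrative sketch only; feature and classifier choices are assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def angle_to_target(gaze_dirs, target_dirs):
        """Angle (rad) between each gaze direction and the eye-to-partner direction."""
        g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
        t = target_dirs / np.linalg.norm(target_dirs, axis=1, keepdims=True)
        return np.arccos(np.clip(np.sum(g * t, axis=1), -1.0, 1.0)).reshape(-1, 1)

    def detect_gazing_events(gaze_dirs, target_dirs, labeled_idx, labels):
        """Fit on the short manually annotated segment, then label every frame."""
        X = angle_to_target(gaze_dirs, target_dirs)
        clf = LogisticRegression().fit(X[labeled_idx], labels)
        return clf.predict(X)  # 1 = gazing at partner, 0 = elsewhere

    # Synthetic usage example: 200 frames, first 40 manually annotated.
    rng = np.random.default_rng(0)
    target = np.tile([0.0, 0.0, 1.0], (200, 1))               # partner straight ahead
    looking = rng.random((200, 1)) < 0.5                      # ground-truth gazing state
    gaze = np.where(looking, [0.0, 0.05, 1.0], [0.8, 0.1, 0.6])
    gaze = gaze + rng.normal(scale=0.05, size=(200, 3))       # estimation noise
    manual = looking[:40, 0].astype(int)                      # short manual coding
    events = detect_gazing_events(gaze, target, np.arange(40), manual)

Under these assumptions, the short manual annotation only has to cover a small portion of the recording; the classifier then propagates the gazing/not-gazing labels to the rest of the sequence frame by frame.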