Perception of gaze direction for situated interaction

  • Authors:
  • Samer Al Moubayed, Gabriel Skantze

  • Affiliations:
  • Stockholm, Sweden; Stockholm, Sweden

  • Venue:
  • Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction
  • Year:
  • 2012

Abstract

Accurate human perception of robots' gaze direction is crucial for the design of natural and fluent situated multimodal face-to-face interaction between humans and machines. In this paper, we present an experiment with 18 test subjects that quantifies how different gaze cues, synthesized on the Furhat back-projected robot head, affect the accuracy with which humans perceive the spatial direction of gaze. The study first quantifies the accuracy of perceived gaze direction in a human-human setup, and compares it to synthesized gaze movements under different conditions: viewing the robot's eyes frontally or from a 45-degree side view. We also study the effect of 3D gaze, in which both eyes are controlled to indicate the depth of the focal point (vergence), the use of gaze versus head pose, and the use of static versus dynamic eyelids. The findings of the study are highly relevant to the design and control of robots and animated agents in situated face-to-face interaction.
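
The vergence manipulation mentioned in the abstract can be illustrated with a small geometric sketch: given a 3D focal point in a head-centered frame, each eye is rotated toward the target from its own position, and the angular difference between the two eyes encodes depth. The function name, coordinate convention, and the 0.06 m interocular distance below are illustrative assumptions, not values or code from the paper.

```python
import math

def vergence_yaw_angles(target, interocular=0.06):
    """Per-eye yaw angles (degrees) so both eyes converge on a 3D target.

    target: (x, y, z) in a head-centered frame, metres
            (x: right, y: up, z: forward).
    interocular: assumed distance between the eyes, metres.
    """
    x, y, z = target
    half = interocular / 2.0
    # Each eye sits at +/- half along the x-axis and rotates toward the target.
    left_yaw = math.degrees(math.atan2(x + half, z))
    right_yaw = math.degrees(math.atan2(x - half, z))
    # The vergence angle is large for near targets and near zero for far ones.
    vergence = left_yaw - right_yaw
    return left_yaw, right_yaw, vergence

# Example: a focal point 0.5 m ahead and slightly to the right.
print(vergence_yaw_angles((0.1, 0.0, 0.5)))
```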