The acoustics of eye contact: detecting visual attention from conversational audio cues

  • Authors:
  • Florian Eyben; Felix Weninger; Lucas Paletta; Björn W. Schuller

  • Affiliations:
  • TUM, Munich, Germany; TUM, Munich, Germany; Joanneum Research, Graz, Austria; Joanneum Research / Imperial College London, Graz, Austria

  • Venue:
  • Proceedings of the 6th workshop on Eye gaze in intelligent human machine interaction: gaze in multimodal interaction
  • Year:
  • 2013

Abstract

An important aspect of short dialogues is attention, as manifested by eye contact between subjects. In this study we provide a first analysis of whether such visual attention is evident in the acoustic properties of a speaker's voice. To this end, we introduce the multi-modal GRAS2 corpus, which was recorded for analysing attention in short, daily-life human-to-human interactions with strangers in public places in Graz, Austria. The corpus contains recordings of four test subjects equipped with eye tracking glasses, three audio recording devices, and motion sensors. We describe how we robustly identify speech segments from the subjects and other people in an unsupervised manner from the multi-channel recordings. We then discuss correlations between the acoustics of the voice in these segments and the point of visual attention of the subjects. A significant relation is found between the acoustic features and the distance between the point of view and the eye region of the dialogue partner. Further, we show that automatic classification of the binary decision eye-contact vs. no eye-contact from acoustic features alone is feasible, with an Unweighted Average Recall of up to 70%.
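
The abstract does not detail the unsupervised segment identification. A minimal sketch of one plausible approach, assuming a close-talk channel for the subject plus farther-field channels and a simple energy-ratio rule (frame sizes, thresholds, and the rule itself are illustrative assumptions, not the authors' method):

```python
import numpy as np

def frame_energy(signal, frame_len=1024, hop=512):
    """Short-time log energy (dB) per frame of a 1-D signal."""
    signal = np.asarray(signal, dtype=np.float64)
    n_frames = 1 + max(0, (len(signal) - frame_len) // hop)
    energies = np.empty(n_frames)
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_len]
        energies[i] = 10 * np.log10(np.sum(frame ** 2) + 1e-12)
    return energies

def subject_speech_frames(close_talk, other_channels, margin_db=6.0, floor_db=-50.0):
    """Mark frames as subject speech when the close-talk channel is clearly
    more energetic than every other (equal-length) channel; the margin and
    floor values are hypothetical."""
    e_close = frame_energy(close_talk)
    e_others = np.max([frame_energy(ch) for ch in other_channels], axis=0)
    n = min(len(e_close), len(e_others))
    return (e_close[:n] > floor_db) & (e_close[:n] - e_others[:n] > margin_db)
```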
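The reported correlation analysis could, for instance, be run per speech segment with Pearson's r. The sketch below uses synthetic placeholder data, since the GRAS2 acoustic features and gaze measurements are not given here:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-segment data: one acoustic feature value (e.g. mean F0)
# and the gaze distance to the partner's eye region for the same segment.
rng = np.random.default_rng(0)
gaze_distance = rng.uniform(0.0, 30.0, size=200)                 # e.g. degrees of visual angle
acoustic_feature = 0.4 * gaze_distance + rng.normal(0, 3, 200)   # synthetic, for illustration

r, p = pearsonr(acoustic_feature, gaze_distance)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")  # relation is significant if p < 0.05
```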
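Unweighted Average Recall (UAR), the measure reported, is the unweighted mean of the per-class recalls, so a classifier that simply guesses the majority class on imbalanced data scores only chance level (50% for two classes). A minimal implementation:

```python
import numpy as np

def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recalls; 0.5 is chance level for two classes."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    recalls = []
    for cls in np.unique(y_true):
        mask = y_true == cls
        recalls.append(np.mean(y_pred[mask] == cls))
    return float(np.mean(recalls))

# Example: eye-contact (1) vs. no eye-contact (0)
print(unweighted_average_recall([0, 0, 0, 1, 1], [0, 0, 1, 1, 1]))  # 0.833...
```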