Perceptual analysis of talking avatar head movements: a quantitative perspective

  • Authors:
  • Xiaohan Ma; Binh Huy Le; Zhigang Deng

  • Affiliations:
  • University of Houston, Houston, Texas, USA; University of Houston, Houston, Texas, USA; University of Houston, Houston, Texas, USA

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

  • Year:
  • 2011

Abstract

Lifelike interface agents (e.g., talking avatars) have been increasingly used in human-computer interaction applications. In this work, we quantitatively analyze how human perception is affected by the audio-head motion characteristics of talking avatars. Specifically, we quantify the correlation between perceptual user ratings (obtained via a user study) and joint audio-head motion features, as well as head motion patterns in the frequency domain. Our quantitative analysis clearly shows that the correlation coefficient between the pitch of speech signals (but not their RMS energy) and head motions is approximately linearly proportional to the perceptual user rating, and that a larger proportion of high-frequency components in talking avatar head movements tends to degrade perceived naturalness.
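
The abstract describes two kinds of measurements: a correlation between speech pitch and head motion, and the proportion of high-frequency content in head-motion trajectories. Below is a minimal sketch of how such quantities could be computed, assuming per-frame F0 values and Euler-angle head rotations sampled at a common frame rate; the function names, the 2 Hz cutoff, and the synthetic data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pitch_head_correlation(f0, head_angles):
    """Pearson correlation between a pitch (F0) contour and per-frame
    head-motion magnitude; inputs are hypothetical, frame-aligned arrays."""
    # head_angles: (n_frames, 3) Euler angles; use frame-to-frame change magnitude
    motion = np.linalg.norm(np.diff(head_angles, axis=0), axis=1)
    f0 = f0[1:len(motion) + 1]            # align lengths after differencing
    return np.corrcoef(f0, motion)[0, 1]

def high_frequency_proportion(signal, fps, cutoff_hz=2.0):
    """Fraction of spectral energy above `cutoff_hz` in one head-motion channel.
    The 2 Hz cutoff is an illustrative choice, not the paper's value."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return spectrum[freqs > cutoff_hz].sum() / spectrum.sum()

# Toy usage with synthetic data (illustration only)
rng = np.random.default_rng(0)
f0 = 120 + 20 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 2, 300)
head = np.cumsum(rng.normal(0, 0.01, (300, 3)), axis=0)
print(pitch_head_correlation(f0, head))
print(high_frequency_proportion(head[:, 0], fps=30))
```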