Movements and voices affect perceived sex of virtual conversers

  • Authors:
  • Rachel McDonnell; Carol O'Sullivan

  • Affiliations:
  • Trinity College Dublin; Trinity College Dublin

  • Venue:
  • Proceedings of the 7th Symposium on Applied Perception in Graphics and Visualization
  • Year:
  • 2010

Abstract

In this paper, we investigate the ability of humans to determine the sex of conversing characters based on audio and visual cues. We used a corpus of motions and sounds captured from three male and three female actors conversing about a range of topics. In our Unisensory Experiments, visual and auditory stimuli were presented separately to participants, who rated how male or female they found them to be. In our Multisensory Experiments, audio and visual information were combined to determine how the two cues interact. We found that audio was much easier to classify than motion, and that audio influenced but did not fully determine ratings when motion and audio were integrated. Finally, even when informative appearance cues were present, they did not help to disambiguate incongruent motion and audio.