Virtual shapers & movers: form and motion affect sex perception
Proceedings of the 4th symposium on Applied perception in graphics and visualization
In this paper, we investigate the ability of humans to determine the sex of conversing characters based on audio and visual cues. We used a corpus of motions and sounds captured from three male and three female actors conversing about a range of topics. In our Unisensory Experiments, visual and auditory stimuli were presented separately, and participants rated how male or female they perceived each stimulus to be. In our Multisensory Experiments, audio and visual information were presented together to determine how they interacted. We found that sex was much easier to classify from audio than from motion, and that when motion and audio were combined, audio affected ratings but did not saturate them. Finally, even when informative appearance cues were present, they did not help to disambiguate incongruent motion and audio.