Parasocial consensus sampling: combining multiple perspectives to learn virtual human behavior

  • Authors:
  • Lixing Huang, Louis-Philippe Morency, Jonathan Gratch

  • Affiliations:
  • University of Southern California, Marina Del Rey, CA (all authors)

  • Venue:
  • Proceedings of the 9th International Conference on Autonomous Agents and Multiagent Systems, Volume 1
  • Year:
  • 2010

Abstract

Virtual humans are embodied software agents that should not only be realistic looking but also have natural and realistic behaviors. Traditional virtual human systems learn these interaction behaviors by observing how individuals respond in face-to-face situations (i.e., direct interaction). In contrast, this paper introduces a novel methodological approach called parasocial consensus sampling (PCS), which allows multiple individuals to vicariously experience the same situation in order to gain insight into the typical (i.e., consensus) human response in social interaction. This approach helps tease apart what is idiosyncratic from what is essential and reveals the strength of the cues that elicit social responses. Our PCS approach has several advantages over traditional methods: (1) it integrates data from multiple independent listeners interacting with the same speaker, (2) it associates a probability with how likely feedback is to be given over time, (3) it can be used as a prior to analyze and understand face-to-face interaction data, and (4) it enables much quicker and cheaper data collection. In this paper, we apply the PCS approach to learn a predictive model of listener backchannel feedback. Our experiments demonstrate that a virtual human driven by our PCS approach creates significantly more rapport and is perceived as more believable than a virtual human driven by face-to-face interaction data.
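
The core of the approach is combining feedback from multiple independent listeners who all experienced the same speaker into a time-varying consensus signal. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual model: it assumes each listener's backchannel feedback is recorded as timestamps, bins them over the speaker's timeline, and reports the fraction of listeners who responded in each bin as a probability-like consensus. The bin size and function names are illustrative assumptions.

```python
import numpy as np

def parasocial_consensus(feedback_times, duration, bin_size=0.5):
    """Combine backchannel feedback from multiple independent listeners
    (all responding to the same speaker recording) into a consensus signal.

    feedback_times: list of 1-D arrays, one per listener, holding the
        timestamps (seconds) at which that listener gave feedback.
    duration: length of the speaker recording in seconds.
    bin_size: width of each time bin in seconds (an assumed parameter).

    Returns, for each time bin, the fraction of listeners who gave
    feedback in that bin -- a probability-like "consensus view".
    """
    edges = np.arange(0.0, duration + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for times in feedback_times:
        hist, _ = np.histogram(times, bins=edges)
        counts += (hist > 0)  # count each listener at most once per bin
    return counts / len(feedback_times)

# Example: three listeners who watched the same 10-second clip
listeners = [
    np.array([1.2, 4.8, 7.5]),
    np.array([1.4, 7.6]),
    np.array([4.9, 7.4, 9.1]),
]
consensus = parasocial_consensus(listeners, duration=10.0)
print(consensus)  # peaks mark moments where most listeners agree feedback fits
```

A signal like this could then serve as a training target or prior for a predictive backchannel model, in the spirit of advantages (2) and (3) above, since it separates moments where feedback is broadly elicited from idiosyncratic responses of a single listener.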