Automatic understanding of affective and social signals by multimodal mimicry recognition

  • Authors:
  • Xiaofan Sun; Anton Nijholt; Khiet P. Truong; Maja Pantic

  • Affiliations:
  • Human Media Interaction, University of Twente, Enschede, The Netherlands (all authors); Maja Pantic is also with the Department of Computing, Imperial College London, UK

  • Venue:
  • ACII'11: Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction, Part II
  • Year:
  • 2011

Abstract

Human mimicry is one of the important behavioral cues displayed during social interaction that inform us about interlocutors' interpersonal states and attitudes. For example, the absence of mimicry is usually associated with negative attitudes. A system capable of analyzing and understanding mimicry behavior could enhance both human-human and human-machine interaction by informing the interlocutors about each other's interpersonal attitudes and feelings of affiliation. Hence, our research focuses on the investigation of mimicry in social human-human and human-machine interactions, with the aim of improving the quality of these interactions. In particular, we aim to develop automatic multimodal mimicry analyzers, to enhance affect recognition and social signal understanding systems through mimicry analysis, and to implement mimicry behavior in Embodied Conversational Agents. This paper surveys and discusses our recent work towards these aims. It is meant to serve as a guide towards this ultimate goal and as a basis for recommendations on the development of automatic mimicry analyzers that facilitate affective computing and social signal processing.
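
To make the notion of a mimicry analyzer concrete, the sketch below illustrates one common building block in mimicry research: time-lagged correlation between two interlocutors' behavioral feature streams (e.g., per-frame head-motion energy), where a strong correlation at a positive lag suggests that one partner is following the other. This is a minimal, hypothetical illustration on synthetic data under those assumptions, not the system described in this paper; the function name `lagged_xcorr`, the feature choice, and all parameters are invented for the example.

```python
import numpy as np

def lagged_xcorr(a: np.ndarray, b: np.ndarray, max_lag: int) -> dict:
    """Pearson correlation between streams a and b at lags 0..max_lag,
    where lag k compares a[t] with b[t + k] (b trailing a by k frames)."""
    scores = {}
    for lag in range(max_lag + 1):
        x = a[: len(a) - lag]   # a truncated at the end
        y = b[lag:]             # b shifted earlier by `lag` frames
        scores[lag] = float(np.corrcoef(x, y)[0, 1])
    return scores

# Synthetic stand-in data: speaker B's head motion echoes
# speaker A's about 15 frames later, plus noise.
rng = np.random.default_rng(0)
motion_a = rng.standard_normal(300)
motion_b = np.roll(motion_a, 15) + 0.3 * rng.standard_normal(300)

scores = lagged_xcorr(motion_a, motion_b, max_lag=30)
best_lag = max(scores, key=scores.get)
print(f"strongest mimicry at lag {best_lag} frames (r = {scores[best_lag]:.2f})")
```

In practice such a score would be computed over sliding windows and compared against a chance baseline before concluding that mimicry is present; the fixed lag range here is purely illustrative.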