Inferring mood in ubiquitous conversational video
Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia
This paper presents a research framework for understanding the communicative emotions that arise between people during conversation. Our advance is to model how these emotions are perceived by other people, rather than attempting to recover the target's true internal state. Because such perception is subjective, we introduce the idea of aggregating a collection of subjective external observations to approximate an objective ground truth. By treating the variation in perceived state as a probability distribution, we propose a computational model that relates the perceived emotion to participants' key nonverbal behaviors, namely gaze and facial expressions. We also propose an evaluation method that compares the distributions estimated by the model with those produced by human observers. The paper reports initial experiments and discusses the framework's potential.
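The core idea above — turning many subjective observer judgments into a probability distribution and scoring a model's estimated distribution against it — can be sketched as follows. This is an illustrative sketch, not the paper's actual implementation: the emotion label set, the annotator data, and the use of KL divergence as the comparison measure are all assumptions for the example.

```python
from collections import Counter
import math

# Hypothetical emotion label set; the paper does not specify one here.
EMOTIONS = ["joy", "anger", "sadness", "neutral"]

def observer_distribution(labels):
    """Turn one clip's annotator labels into an empirical probability
    distribution over EMOTIONS (the 'collection of subjective observations')."""
    counts = Counter(labels)
    n = len(labels)
    return [counts.get(e, 0) / n for e in EMOTIONS]

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) with additive smoothing; lower means the model's estimated
    distribution q is closer to the observers' empirical distribution p."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Ten hypothetical annotators rate the same conversational clip.
labels = ["joy"] * 6 + ["neutral"] * 3 + ["anger"]
p_obs = observer_distribution(labels)   # empirical: [0.6, 0.1, 0.0, 0.3]
p_model = [0.55, 0.15, 0.05, 0.25]      # hypothetical model output
score = kl_divergence(p_obs, p_model)   # evaluation: distance between the two
```

In this scheme, disagreement among observers is not noise to be averaged away but the signal itself: the spread of the empirical distribution is what the model is asked to reproduce.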