Towards Knowledge-Based Affective Interaction: Situational Interpretation of Affect

  • Authors:
  • Abdul Rehman Abbasi;Takeaki Uno;Matthew N. Dailey;Nitin V. Afzulpurkar

  • Affiliations:
  • Asian Institute of Technology, Bangkok, Thailand;National Institute of Informatics, Tokyo, Japan;Asian Institute of Technology, Bangkok, Thailand;Asian Institute of Technology, Bangkok, Thailand

  • Venue:
  • ACII '07: Proceedings of the 2nd International Conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2007

Abstract

Human-computer interaction in a variety of applications could benefit if systems could accurately analyze and respond to their users' affect. Although a great deal of research has been conducted on affect recognition, very little of this work has considered what information is appropriate to extract in specific situations. Towards understanding how specific applications such as affective tutoring and affective entertainment could benefit, we present two experiments. In the first experiment, we found that students' facial expressions, together with their body actions, gave little information about their internal emotion per se, but they were useful features for predicting the students' self-reported "true" mental state. In the second experiment, we found significant differences between the facial expressions and the self-reported affective states of viewers watching a movie sequence. Our results suggest that the noisy relationship between observable gestures and underlying affect must be accounted for when designing affective computing applications.
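
To make the abstract's point about the noisy gesture-to-affect mapping concrete, the sketch below (not from the paper; the gesture and state labels are invented for illustration) counts how often a hypothetical observed facial expression and body action co-occur with a self-reported state, then predicts the most frequent state along with its empirical probability, making the uncertainty of the mapping explicit.

```python
from collections import Counter, defaultdict

# Hypothetical observation log: (facial expression, body action) -> self-reported state.
# Labels are illustrative only, not the categories used in the paper's experiments.
observations = [
    (("smile", "lean_forward"), "satisfied"),
    (("smile", "lean_forward"), "confused"),   # same gesture, different inner state
    (("frown", "head_scratch"), "confused"),
    (("frown", "head_scratch"), "thinking"),
    (("neutral", "still"), "bored"),
    (("neutral", "still"), "thinking"),
]

# Count how often each gesture pair co-occurs with each self-reported state.
counts = defaultdict(Counter)
for gesture, state in observations:
    counts[gesture][state] += 1

def predict_state(gesture):
    """Return the most frequent self-reported state for a gesture and its
    empirical probability, so callers can see how noisy the mapping is."""
    dist = counts.get(gesture)
    if not dist:
        return None, 0.0
    state, n = dist.most_common(1)[0]
    return state, n / sum(dist.values())

print(predict_state(("smile", "lean_forward")))  # e.g. ('satisfied', 0.5) -- far from certain
```

Returning the probability alongside the prediction is the design point: an affective tutoring or entertainment application built on such observations would need to treat low-confidence predictions differently from confident ones rather than assuming a one-to-one mapping from expression to emotion.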