Automatic Detection of Learner's Affect From Gross Body Language

  • Authors:
  • Sidney D'Mello; Art Graesser

  • Affiliations:
  • The University of Memphis, Department of Computer Science, Memphis, Tennessee; The University of Memphis, Department of Computer Science, Memphis, Tennessee

  • Venue:
  • Applied Artificial Intelligence
  • Year:
  • 2009

Abstract

We explored the reliability of detecting learners' affect by monitoring their gross body language (body position and arousal) during interactions with an intelligent tutoring system called AutoTutor. Training and validation data on affective states were collected in a learning session with AutoTutor, after which the learners' affective states (i.e., emotions) were rated by the learner, a peer, and two trained judges. An automated body pressure measurement system was used to capture the pressure the learner exerted on the seat and back of a chair during the tutoring session. We extracted two sets of features from the pressure maps. The first set focused on the average pressure exerted, along with the magnitude and direction of changes in pressure during emotional experiences. The second set monitored the spatial and temporal properties of naturally occurring pockets of pressure. We constructed five data sets that temporally integrated the affective judgments with the two sets of pressure features. The first four data sets corresponded to the judgments of the learner, the peer, and each of the two trained judges, whereas the fifth integrated the judgments of the two trained judges. Machine-learning experiments that discriminated each affective state from neutral yielded detection accuracies of 73% for boredom, 72% for confusion, 70% for delight, 83% for flow, and 74% for frustration (chance = 50%). Accuracies for discriminations among two, three, four, and five affective states (excluding neutral) were 71%, 55%, 46%, and 40%, with chance rates of 50%, 33%, 25%, and 20%, respectively.
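
As a rough illustration of the first feature set described above (average pressure plus the magnitude and direction of pressure changes), the sketch below computes these quantities from two consecutive pressure maps. It is a minimal sketch, not the paper's implementation: it assumes each pressure map is a 2D array of sensor readings, and the array shapes, units, and function names are illustrative.

```python
import numpy as np

def first_feature_set(prior_map, current_map):
    """Illustrative features in the spirit of the first feature set:
    average pressure, plus the magnitude and direction (sign) of the
    change in mean pressure between two successive seat/back maps.
    Shapes and units are assumptions, not taken from the paper."""
    prior_map = np.asarray(prior_map, dtype=float)
    current_map = np.asarray(current_map, dtype=float)

    avg_pressure = current_map.mean()      # average pressure exerted
    delta = current_map.mean() - prior_map.mean()
    change_magnitude = abs(delta)          # how much the mean pressure shifted
    change_direction = np.sign(delta)      # +1 pressing harder, -1 easing off, 0 unchanged

    return {
        "avg_pressure": avg_pressure,
        "change_magnitude": change_magnitude,
        "change_direction": change_direction,
    }

# Example with two hypothetical 4x4 pressure maps (arbitrary units)
before = np.full((4, 4), 10.0)
after = np.full((4, 4), 12.5)
print(first_feature_set(before, after))
```

In the study these features were aligned in time with the affect judgments before classification; the second feature set, which tracks spatial and temporal properties of pressure pockets, would require segmenting contiguous high-pressure regions within each map and is not sketched here.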