Faces of pain: automated measurement of spontaneous facial expressions of genuine and posed pain

  • Authors:
  • Gwen C. Littlewort; Marian Stewart Bartlett; Kang Lee

  • Affiliations:
  • University of California, San Diego, San Diego, CA; University of California, San Diego, San Diego, CA; University of Toronto, Toronto, ON, Canada

  • Venue:
  • Proceedings of the 9th International Conference on Multimodal Interfaces
  • Year:
  • 2007

Abstract

We present initial results from the application of an automated facial expression recognition system to spontaneous facial expressions of pain. In this study, 26 participants were videotaped under three experimental conditions: baseline, posed pain, and real pain. In the real pain condition, subjects experienced cold pressor pain by submerging their arm in ice water. Our goal was to automatically determine which experimental condition was shown in a 60-second clip from a previously unseen subject. We chose a machine learning approach, previously used successfully to categorize basic emotional facial expressions in posed datasets as well as to detect individual facial actions of the Facial Action Coding System (FACS) (Littlewort et al., 2006; Bartlett et al., 2006). For this study, we trained 20 Action Unit (AU) classifiers on over 5000 images selected from a combination of posed and spontaneous facial expressions. The output of each classifier was a real-valued number indicating the distance to its separating hyperplane. Applying this system to the pain video data produced a 20-channel output stream, consisting of one real value for each learned AU, for each frame of the video. This data was passed to a second layer of classifiers to predict the difference between baseline and pained faces, and the difference between expressions of real pain and faked pain. Naïve human subjects tested on the same videos were at chance for differentiating faked from real pain, obtaining only 52% accuracy. The automated system successfully differentiated faked from real pain: in an analysis of 26 subjects, it obtained 72% correct for subject-independent discrimination of real versus faked pain on a 2-alternative forced choice. Moreover, the most discriminative facial action in the automated system output was AU 4 (brow lower), consistent with findings based on human expert FACS codes.
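
The two-layer pipeline described in the abstract (frame-level AU detectors whose signed hyperplane distances form a 20-channel stream that feeds a clip-level classifier) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the use of scikit-learn linear SVMs, the feature dimension, the assumed frame rate, and the mean/std clip summary are all stand-in assumptions.

```python
# Hedged sketch of the two-layer classification pipeline.
# All names, shapes, and training data here are illustrative stand-ins.
import numpy as np
from sklearn.svm import LinearSVC

N_AUS = 20          # one detector per learned Action Unit
N_FRAMES = 1800     # a 60-second clip at an assumed 30 fps
N_FEATURES = 256    # stand-in for per-frame appearance features

rng = np.random.default_rng(0)

# --- Layer 1: one AU detector per Action Unit -------------------------
# Each detector is trained on labeled face images (random stand-in data
# here) and, at test time, emits the signed distance to its separating
# hyperplane for every video frame.
au_detectors = []
for _ in range(N_AUS):
    X_train = rng.normal(size=(500, N_FEATURES))
    y_train = rng.integers(0, 2, size=500)      # AU present / absent
    au_detectors.append(LinearSVC(dual=False).fit(X_train, y_train))

def au_stream(frames: np.ndarray) -> np.ndarray:
    """Map (n_frames, n_features) frames to a (n_frames, N_AUS) stream
    of real-valued hyperplane distances, one channel per AU."""
    return np.column_stack([clf.decision_function(frames)
                            for clf in au_detectors])

# --- Layer 2: clip-level classification -------------------------------
# Summarize each clip's 20-channel stream (per-channel mean and std is
# one plausible choice; the abstract does not commit to a statistic)
# and train a second classifier to separate real from faked pain.
def clip_features(frames: np.ndarray) -> np.ndarray:
    stream = au_stream(frames)
    return np.concatenate([stream.mean(axis=0), stream.std(axis=0)])

clips = [rng.normal(size=(N_FRAMES, N_FEATURES)) for _ in range(40)]
labels = rng.integers(0, 2, size=40)            # 0 = faked, 1 = real pain
X_clips = np.stack([clip_features(c) for c in clips])
pain_classifier = LinearSVC(dual=False).fit(X_clips, labels)

print(pain_classifier.predict(X_clips[:5]))
```

Feeding the second layer summary statistics of the hyperplane-distance stream, rather than hard AU present/absent decisions, preserves the graded intensity information that the abstract's real-valued outputs provide; the specific summary used above is only one plausible reading of that design.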