Gesture-Based Affective Computing on Motion Capture Data

  • Authors:
  • Asha Kapur; Ajay Kapur; Naznin Virji-Babul; George Tzanetakis; Peter F. Driessen

  • Affiliations:
  • School of Medicine, Wake Forest University, North Carolina, United States (Asha Kapur); University of Victoria, Victoria, British Columbia, Canada (Ajay Kapur, Naznin Virji-Babul, George Tzanetakis, Peter F. Driessen)

  • Venue:
  • ACII '05: Proceedings of the First International Conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2005


Abstract

This paper presents research that uses full-body skeletal movements, captured with video-based motion-capture technology developed by Vicon Motion Systems, to train a machine to identify different human emotions. The Vicon system uses six cameras to track lightweight markers placed at various points on the body and digitizes their movement into x, y, and z displacement data. Gestural data were collected from five subjects depicting four emotions: sadness, joy, anger, and fear. Experimental results with several machine learning techniques show automatic classification accuracies ranging from 84% to 92%, depending on how accuracy is computed. To put these automatic classification results into perspective, a user study on human perception of the same data was conducted, yielding an average classification accuracy of 93%.
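
The abstract describes the pipeline only at a high level: camera-tracked markers yield per-marker x/y/z displacement series, and several unnamed machine learning techniques classify the four emotions. The sketch below is a hypothetical reconstruction in Python with scikit-learn, not the authors' implementation: it assumes summary statistics of marker speed and acceleration as features, an SVM as the classifier, and synthetic stand-in data (`extract_features`, the 14-marker count, and the random recordings are all illustrative assumptions).

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["sadness", "joy", "anger", "fear"]

def extract_features(markers: np.ndarray) -> np.ndarray:
    """Summarize one recording of shape (frames, n_markers, 3),
    holding x/y/z marker positions, as a fixed-length vector.

    Uses per-marker statistics of position, velocity, and
    acceleration magnitudes -- an assumed feature set; the
    paper's actual features may differ.
    """
    velocity = np.diff(markers, axis=0)       # frame-to-frame displacement
    acceleration = np.diff(velocity, axis=0)  # change in displacement
    feats = []
    for series in (markers, velocity, acceleration):
        magnitude = np.linalg.norm(series, axis=2)  # (frames, n_markers)
        feats.extend([magnitude.mean(axis=0),
                      magnitude.std(axis=0),
                      magnitude.max(axis=0)])
    return np.concatenate(feats)

# Synthetic stand-in data: 40 recordings of 120 frames over a
# hypothetical 14-marker set, with random emotion labels.
rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.normal(size=(120, 14, 3)))
              for _ in range(40)])
y = rng.choice(EMOTIONS, size=40)

# One plausible classifier choice; the paper compared several.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2%}")
```

On real recordings, the feature set and classifier choice would drive accuracy into the 84%-92% range the abstract reports; on the synthetic data here the score sits near chance by construction.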