Hi4D-ADSIP 3-D dynamic facial articulation database

  • Authors:
  • Bogdan J. Matuszewski, Wei Quan, Lik-Kwan Shark, Alison S. McLoughlin, Catherine E. Lightbody, Hedley C. A. Emsley, Caroline L. Watkins

  • Affiliations:
  • Applied Digital Signal and Image Processing (ADSIP) Research Centre, School of Computing, Engineering and Physical Sciences, University of Central Lancashire, Preston PR1 2HE, United Kingdom
  • School of Health, University of Central Lancashire, Preston PR1 2HE, United Kingdom
  • Department of Neurology, Royal Preston Hospital, Preston PR2 9HT, United Kingdom

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2012


Abstract

The face is an important medium of human communication: facial articulation reflects a person's emotional and awareness states, cognitive activity, personality and wellbeing. With advances in 3-D imaging technology and ever-increasing computing power, automatic analysis of facial articulation from 3-D sequences is becoming viable. This paper describes Hi4D-ADSIP, a comprehensive 3-D dynamic facial articulation database containing scans with high spatial and temporal resolution. The database is designed not only to facilitate studies of facial expression analysis, but also to aid research into the clinical diagnosis of facial dysfunction. It currently contains 3360 facial sequences captured from 80 healthy volunteers (control subjects) of various ages, genders and ethnicities. The database has been validated through psychophysical experiments that formally evaluate the accuracy of the recorded expressions. Results of baseline automatic facial expression recognition methods using eigenfaces and Fisherfaces are also presented, alongside some initial results obtained for clinical cases. This database is believed to be one of the most comprehensive repositories of dynamic 3-D facial articulations to date. An extension of the database is currently under construction, aiming to build a comprehensive repository of representative facial dysfunctions exhibited by patients with stroke, Bell's palsy and Parkinson's disease.
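The eigenfaces baseline mentioned in the abstract can be sketched in a few lines: project mean-centred face vectors onto a PCA subspace and classify a probe by its nearest neighbour in that subspace. The snippet below is a minimal illustration on synthetic data, not the paper's implementation; the cluster values, dimensionality and component count are all assumptions made for the demonstration.

```python
import numpy as np

def eigenface_classifier(train, labels, n_components):
    """Fit an eigenfaces-style PCA subspace to `train` (n_samples x n_pixels)
    and return a nearest-neighbour classifier operating in that subspace."""
    mean = train.mean(axis=0)
    centred = train - mean
    # SVD of the centred data; rows of vt are the principal components
    # ("eigenfaces" when the rows of `train` are flattened face images).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    basis = vt[:n_components]            # (k, n_pixels)
    coords = centred @ basis.T           # training projections, (n_samples, k)

    def predict(face):
        proj = (face - mean) @ basis.T
        dists = np.linalg.norm(coords - proj, axis=1)
        return labels[int(np.argmin(dists))]

    return predict

# Toy demonstration: two well-separated synthetic "expression" clusters.
rng = np.random.default_rng(0)
happy = rng.normal(0.0, 0.1, size=(10, 64)) + 1.0
sad = rng.normal(0.0, 0.1, size=(10, 64)) - 1.0
X = np.vstack([happy, sad])
y = ["happy"] * 10 + ["sad"] * 10
predict = eigenface_classifier(X, y, n_components=5)
print(predict(np.full(64, 0.9)))   # probe near the "happy" cluster
```

A Fisherfaces baseline differs only in the projection step, using linear discriminant analysis instead of PCA so that the subspace maximises between-class rather than overall variance.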