Hi4D-ADSIP 3-D dynamic facial articulation database
Image and Vision Computing
Human faces play an important role in everyday life: they express identity, emotion and intentionality, and serve a range of biological functions. The face has accordingly become the subject of considerable research effort, with a shift towards studying it through increasingly realistic stimuli. In the current work, we outline progress made in producing a database of facial expressions in arguably the most realistic format, 3D dynamic. We describe a suitable architecture for capturing such 3D dynamic image sequences, then use it to record seven expressions (fear, disgust, anger, happiness, surprise, sadness and pain) performed by 10 actors at three levels of intensity (mild, normal and extreme). We also present details of a psychological experiment used to formally evaluate the accuracy of the expressions in a 2D dynamic format. The result is an initial, validated database for researchers and practitioners; the goal is to scale up the work with more actors and expression types.
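The recording protocol above defines a fixed matrix of sequences. A minimal sketch of that matrix follows; the identifier names and dictionary layout are illustrative assumptions, not the database's actual file organisation:

```python
# Illustrative sketch of the Hi4D-ADSIP recording matrix described in the
# abstract: 7 expressions x 10 actors x 3 intensity levels.
# Actor labels here are hypothetical placeholders.
from itertools import product

EXPRESSIONS = ["fear", "disgust", "anger", "happiness",
               "surprise", "sadness", "pain"]
ACTORS = [f"actor{i:02d}" for i in range(1, 11)]
INTENSITIES = ["mild", "normal", "extreme"]

# One entry per recorded 3D dynamic sequence.
recordings = [
    {"expression": e, "actor": a, "intensity": i}
    for e, a, i in product(EXPRESSIONS, ACTORS, INTENSITIES)
]

print(len(recordings))  # 7 * 10 * 3 = 210 sequences
```

This enumeration makes the scale of the initial database explicit (210 sequences) and shows how it grows linearly as more actors or expression types are added.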