In this paper, we present a fully automatic technique for detecting and classifying the six basic facial expressions from nearly frontal face images. Facial expressions are communicated by subtle changes in one or more discrete features, such as tightening the lips, raising the eyebrows, or opening and closing the eyes, or by certain combinations of these. Such discrete features can be identified by monitoring changes in muscle movement (Action Units) in the regions around the mouth, eyes and eyebrows. In this work, we use eleven feature points that represent and identify the principal muscle actions and provide measurements of the discrete features responsible for each of the six basic human emotions. A multi-detector approach to facial feature point localization is used to identify these points of interest on the contours of facial components such as the eyes, eyebrows and mouth. A feature vector of eleven features is then obtained by calculating the displacement of each of these eleven feature points from a fixed rigid point. Finally, the resulting feature sets are used to train a K-Nearest Neighbor classifier, which classifies a facial expression when it is presented as a feature set. The developed Automatic Facial Expression Classifier has been tested on a publicly available facial expression database, achieving an average successful classification rate of 90.76%.
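The pipeline described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the choice of the nose tip as the rigid reference point, the use of plain Euclidean displacement as the feature, and the value k = 3 are assumptions for the sketch, not details confirmed by the paper.

```python
import math
from collections import Counter

def displacement_features(points, rigid_point):
    """Reduce a face to a feature vector: the Euclidean displacement of each
    tracked feature point (eyes, eyebrows, mouth) from a rigid reference
    point (assumed here to be the nose tip)."""
    rx, ry = rigid_point
    return [math.hypot(x - rx, y - ry) for x, y in points]

def knn_classify(train_vectors, train_labels, query, k=3):
    """Plain k-nearest-neighbour classification: majority vote among the
    k training vectors closest (in squared Euclidean distance) to the query."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(vec, query)), label)
        for vec, label in zip(train_vectors, train_labels)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy usage with 3-dimensional vectors (the paper uses 11 features per face):
train = [[0.0, 0.0, 0.0], [0.1, 0.1, 0.1], [5.0, 5.0, 5.0], [5.1, 5.1, 5.1]]
labels = ["neutral", "neutral", "happy", "happy"]
print(knn_classify(train, labels, [0.05, 0.05, 0.05]))  # → neutral
```

In the actual system the training vectors would come from the eleven localized feature points of labelled database images, with one label per basic emotion.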