Eyes play an important role in emotional and paralinguistic communication, and detecting eye state is necessary for applications such as driver-awareness systems. In this paper, we develop an automatic system that detects eye-state action units (AUs) of the Facial Action Coding System (FACS) using Gabor wavelets in nearly frontal-view image sequences. Three eye-state AUs (AU 41, AU 42, and AU 43) are detected. After tracking the eye corners through the whole sequence, eye appearance information is extracted at three points per eye (the inner corner, the outer corner, and the midpoint between them) as a set of multi-scale, multi-orientation Gabor coefficients. The normalized Gabor coefficients are then fed into a neural-network-based eye-state AU detector. An average recognition rate of 83% is obtained on 112 images from 17 image sequences of 12 subjects.
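The feature-extraction step above can be sketched as follows: a bank of Gabor kernels at several scales and orientations is applied at one landmark point, and the magnitude responses are collected and normalized into a feature vector. This is a minimal illustration, not the paper's implementation; the kernel size, the two scales, the four orientations, and the L2 normalization are all illustrative assumptions.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam):
    """Complex Gabor kernel: Gaussian envelope times a complex sinusoid
    oriented at angle theta with wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)          # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.exp(1j * 2.0 * np.pi * xr / lam)
    return envelope * carrier

def gabor_coeffs(image, point, scales=(4, 8), orientations=4, size=15):
    """Normalized magnitude responses of a Gabor filter bank at one
    landmark point (e.g., an eye corner). Parameter values are
    illustrative, not the paper's exact settings."""
    r, c = point
    half = size // 2
    patch = image[r - half:r + half + 1, c - half:c + half + 1]
    coeffs = []
    for lam in scales:                                   # multi-scale
        for k in range(orientations):                    # multi-orientation
            theta = k * np.pi / orientations
            kern = gabor_kernel(size, sigma=lam / 2.0, theta=theta, lam=lam)
            coeffs.append(np.abs(np.sum(patch * kern)))  # filter response
    v = np.array(coeffs)
    return v / (np.linalg.norm(v) + 1e-8)                # unit-norm feature
```

The vectors computed this way at the three points per eye would be concatenated and passed to the classifier; normalizing them makes the features less sensitive to overall image brightness.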