Moment invariants are invariant under translation, scaling, and rotation, and they are widely used in pattern recognition for their discriminative power and robustness. The hidden Markov model (HMM) provides a natural and reliable framework for recognizing temporal sequences. In this paper, we propose a facial expression recognition method that uses moment invariants as features and an HMM as the classifier. Image sequences of four universal expressions, i.e. anger, disgust, happiness, and surprise, are recognized, and we attain an accuracy of up to 96.77%.
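As an illustration of the feature side of this pipeline, the sketch below computes Hu's seven moment invariants from normalized central moments. This is a common choice of moment-invariant set; the abstract does not specify which invariants the paper uses, so treat this as an assumption. Translation invariance comes from centering on the centroid, scale invariance from normalizing by powers of the zeroth moment, and rotation invariance from Hu's particular combinations.

```python
import numpy as np

def hu_moments(img):
    """Compute Hu's seven moment invariants of a 2-D grayscale image.

    These are invariant to translation, scale, and rotation, which is
    why moment invariants are attractive as expression features.
    """
    img = np.asarray(img, dtype=float)
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):
        # Central moment: centering on the centroid gives translation invariance.
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):
        # Normalized central moment: dividing by m00**(1+(p+q)/2) gives scale invariance.
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    # Hu's seven rotation-invariant combinations.
    return np.array([
        n20 + n02,
        (n20 - n02) ** 2 + 4 * n11 ** 2,
        (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
        (n30 + n12) ** 2 + (n21 + n03) ** 2,
        (n30 - 3 * n12) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
        + (3 * n21 - n03) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
        (n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
        + 4 * n11 * (n30 + n12) * (n21 + n03),
        (3 * n21 - n03) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
        - (n30 - 3 * n12) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
    ])
```

In a sequence-based recognizer of this kind, one such feature vector would typically be extracted per frame and the resulting sequence fed to a per-expression HMM; the details of that stage are not specified in the abstract.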