Many people believe that emotions and subjective feelings are one and the same, and that a goal of human-centered computing is emotion recognition. The first belief is outdated; the second is mistaken. For human-centered computing to succeed, a different way of thinking is needed. Emotions are species-typical patterns that evolved because of their value in addressing fundamental life tasks. Emotions consist of multiple components, of which subjective feelings may be one. They are not directly observable; they must be inferred from expressive behavior, self-report, physiological indicators, and context. I focus on expressive facial behavior because of its coherence with other indicators and the breadth of research supporting it. Among the topics included are measurement, timing, individual differences, dyadic interaction, and inference. I propose that the design and implementation of perceptual user interfaces may be better informed by considering the complexity of emotion, its various indicators, its measurement, individual differences, dyadic interaction, and the problems of inference.