We previously developed an intelligent agent that engages with users in virtual drama improvisation. The agent was able to perform sentence-level affect detection from user inputs containing strong emotional indicators. However, we noticed that many inputs with weak or no affect indicators still carry emotional implications, yet were treated as neutral expressions by the previous interpretation. In this paper, we employ latent semantic analysis to perform topic theme detection and to identify the target audiences of such inputs. We also discuss how this semantic interpretation of the dialogue context is used to interpret affect more appropriately during virtual improvisation. Moreover, in order to build a reliable affect analyser, it is important to detect and combine weak affect indicators from other channels such as body language. Emotional body language detection also provides a non-intrusive channel for sensing users' experience without interfering with the primary task. We therefore also make an initial exploration of affect detection from several universally accepted emotional gestures.
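The topic theme detection described above can be sketched with latent semantic analysis: represent a small set of theme exemplars in a TF-IDF term space, reduce it to a low-rank latent space via truncated SVD, and assign an utterance to the theme whose latent vector it is closest to. This is a minimal illustrative sketch, not the paper's actual system; the theme exemplars, the number of latent dimensions, and the `detect_theme` helper are all assumptions for demonstration.

```python
# Hypothetical sketch of topic theme detection via latent semantic analysis
# (LSA). The theme exemplars and dimensionality below are illustrative
# assumptions, not the paper's actual data or implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Toy theme exemplars a drama-improvisation agent might track.
theme_docs = {
    "bullying": "stop picking on me leave him alone you always tease us",
    "school":   "homework teacher class lesson exam school",
    "family":   "mum dad brother sister home dinner",
}

corpus = list(theme_docs.values())
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(corpus)

# Project the term-document matrix into a low-rank latent semantic space.
svd = TruncatedSVD(n_components=2, random_state=0)
theme_vecs = svd.fit_transform(tfidf)

def detect_theme(utterance: str) -> str:
    """Return the theme whose latent vector is closest to the utterance."""
    vec = svd.transform(vectorizer.transform([utterance]))
    sims = cosine_similarity(vec, theme_vecs)[0]
    return list(theme_docs)[sims.argmax()]

print(detect_theme("why do you always tease me at school"))
```

In a real system the theme exemplars would come from an annotated improvisation corpus and the latent dimensionality would be tuned, but the pipeline shape (TF-IDF, SVD projection, nearest-theme lookup by cosine similarity) is the standard LSA recipe.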