In this paper we present an evaluation of an affective multimodal fusion approach that uses dimensional representations of emotion. The evaluation relies on physiological signals as a reference measure of users' emotional states. Surface electromyography (EMG) and galvanic skin response (GSR) signals are known to correlate with specific dimensions of emotion (Pleasure and Arousal) and are compared here against real-time continuous values of these dimensions obtained from the affective multimodal fusion. Both the qualitative and quantitative results suggest that the multimodal fusion approach is consistent with physiological indicators of emotion, constituting a first positive evaluation of the approach.
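The core of such an evaluation is measuring agreement between a physiological trace (e.g. GSR) and the continuous dimension values produced by fusion. A minimal sketch of that comparison is shown below; the signal names and the synthetic data are hypothetical stand-ins, not the paper's actual recordings, and Pearson correlation is used here only as one plausible agreement measure.

```python
import numpy as np

def pearson_corr(x, y):
    # Pearson correlation between two equal-length 1-D signals.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()
    yc = y - y.mean()
    return float(np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc)))

# Illustrative only: a synthetic "GSR" trace and a fused "arousal"
# estimate that loosely tracks it (both signals are made up here).
t = np.linspace(0.0, 10.0, 500)
rng = np.random.default_rng(0)
gsr = np.sin(0.5 * t) + 0.1 * rng.normal(size=t.size)
arousal = np.sin(0.5 * t)

r = pearson_corr(gsr, arousal)
print(f"correlation between GSR and fused arousal: {r:.2f}")
```

In practice the two streams would first need resampling to a common rate and temporal alignment (physiological responses lag the eliciting event), after which a per-dimension correlation gives a simple quantitative consistency check.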