Affective computing
Only intrusive and expensive ways of precisely expressing emotions have been proposed so far, and these are unlikely to appear soon in everyday Ubicomp environments. In this paper, we study to what extent we can identify the emotion a user explicitly expresses through 2D and 3D gestures. Indeed, users already routinely manipulate mobile devices equipped with touch screens and accelerometers. We conducted a field study in which we asked participants to explicitly express their emotions through gestures and to report their affective states. We contribute by (1) showing a high number of significant correlations between 3D motion descriptors of gestures and the arousal dimension; (2) defining a space of affective gestures; (3) identifying groups of descriptors that structure this space and relate to arousal; (4) providing a preliminary model of arousal; and (5) identifying interesting patterns in particular classes of gestures. Such results can help Ubicomp application designers envision the use of gestures as a cheap and non-intrusive affective modality.
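
To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch, assuming accelerometer traces sampled at a fixed rate: it computes a few common 3D motion descriptors (magnitude statistics, energy, jerk) and fits a linear arousal model by least squares. The descriptor set, the function names (motion_descriptors, fit_arousal_model), and the linear model are illustrative assumptions, not the paper's actual method.

import numpy as np

def motion_descriptors(accel, dt):
    # accel: (N, 3) array of x/y/z acceleration samples for one gesture.
    # dt: sampling interval in seconds.
    # These descriptors are common choices in the gesture literature and
    # are hypothetical here, not the exact set used in the paper.
    mag = np.linalg.norm(accel, axis=1)        # per-sample acceleration magnitude
    jerk = np.diff(accel, axis=0) / dt         # rate of change of acceleration
    return {
        "duration": len(accel) * dt,
        "mean_magnitude": float(mag.mean()),
        "peak_magnitude": float(mag.max()),
        "energy": float(np.sum(mag ** 2) * dt),
        "mean_jerk": float(np.linalg.norm(jerk, axis=1).mean()),
    }

def fit_arousal_model(X, arousal):
    # X: (gestures, descriptors) matrix; arousal: self-reported ratings.
    # Ordinary least squares with an intercept term; a stand-in for
    # whatever "preliminary model of arousal" the study actually built.
    X1 = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(X1, arousal, rcond=None)
    return coef

# Example: descriptors for a synthetic 1-second gesture sampled at 100 Hz.
# feats = motion_descriptors(np.random.randn(100, 3), dt=0.01)

A design note on this sketch: descriptors such as energy and mean jerk capture how vigorous a movement is, which is why one would expect them, rather than shape-related features, to correlate with the arousal dimension reported by participants.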