Working with emotion-related states in technological contexts requires a standard representation format. Based on that premise, the W3C Emotion Incubator group was created to lay the foundations for such a standard. This paper reports on two results of the group's work: a collection of use cases, and the requirements derived from them. We compiled a rich collection of use cases and grouped them into three types: data annotation, emotion recognition, and generation of emotion-related behaviour. From these use cases, a structured set of requirements was distilled. It comprises the representation of the emotion-related state itself, meta-information about that representation, various kinds of links to the "rest of the world", and several kinds of global metadata. We summarise the work and provide pointers to the working documents containing full details.
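To make the four requirement groups concrete, the sketch below builds a small annotation document covering each of them: the emotion-related state itself, meta-information about the representation, a link to the "rest of the world", and global metadata. The element and attribute names here are purely illustrative assumptions, not the syntax of the group's working documents or of any later standard.

```python
# Hypothetical sketch of the four requirement groups; all element and
# attribute names are invented for illustration only.
import xml.etree.ElementTree as ET

doc = ET.Element("emotion-annotation")
doc.set("annotator", "rater-01")                 # global metadata

emo = ET.SubElement(doc, "emotion")
ET.SubElement(emo, "category", name="anger")     # the state itself
ET.SubElement(emo, "intensity", value="0.7")
ET.SubElement(emo, "confidence", value="0.8")    # meta-information about the representation
ET.SubElement(emo, "link",                       # link to the "rest of the world":
              uri="clip03.avi",                  # the annotated media clip
              start="2.5", end="4.1")            # and the time span within it

xml_string = ET.tostring(doc, encoding="unicode")
print(xml_string)
```

A real standard would of course need to support alternative descriptive schemes (categories, dimensions, appraisals) rather than the single hard-coded category shown here; the sketch only mirrors the grouping of requirements named in the abstract.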