Conversion between prosodic transcription systems: “Standard British” and ToBI
Speech Communication
Describing the emotional states that are expressed in speech
Speech Communication - Special issue on speech and emotion
Levels of representation in the annotation of emotion for the specification of expressivity in ECAs
Lecture Notes in Computer Science
2005 Special Issue: Beyond emotion archetypes: Databases for emotion modelling using neural networks
Neural Networks - Special issue: Emotion and brain
2005 Special Issue: Challenges in real-life emotion annotation and machine learning based detection
Neural Networks - Special issue: Emotion and brain
ASR for emotional speech: Clarifying the issues and enhancing performance
Neural Networks - Special issue: Emotion and brain
Observer annotation of affective display and evaluation of expressivity: face vs. face-and-body
VisHCI '06 Proceedings of the HCSNet workshop on Use of vision in human-computer interaction - Volume 56
Applying an analysis of acted vocal emotions to improve the simulation of synthetic speech
Computer Speech and Language
Fully generated scripted dialogue for embodied agents
Artificial Intelligence
Fear-type emotion recognition for future audio-based surveillance systems
Speech Communication
A model of gaze for the purpose of emotional expression in virtual embodied agents
Proceedings of the 7th international joint conference on Autonomous agents and multiagent systems - Volume 1
ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
Emotion Recognition through Multiple Modalities: Face, Body Gesture, Speech
Affect and Emotion in Human-Computer Interaction
A three-layered model for expressive speech perception
Speech Communication
Personal and Ubiquitous Computing
Speech Emotion Perception by Human and Machine
Verbal and Nonverbal Features of Human-Human and Human-Machine Interaction
Gaze and Gesture Activity in Communication
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
A privacy-sensitive approach to modeling multi-person conversations
IJCAI'07 Proceedings of the 20th international joint conference on Artificial intelligence
A Wizard-of-Oz game for collecting emotional audio data in a children-robot interaction
Proceedings of the International Workshop on Affective-Aware Virtual Agents and Social Robots
Automatic recognition of speech emotion using long-term spectro-temporal features
DSP'09 Proceedings of the 16th international conference on Digital Signal Processing
IEEE Transactions on Audio, Speech, and Language Processing
IWANN'07 Proceedings of the 9th international work-conference on Artificial neural networks
A study on speech with manifest emotions
TSD'07 Proceedings of the 10th international conference on Text, speech and dialogue
Objective and subjective evaluation of an expressive speech corpus
NOLISP'07 Proceedings of the 2007 international conference on Advances in nonlinear speech processing
Recognition of emotional state in Polish speech: comparison between human and automatic efficiency
BioID_MultiComm'09 Proceedings of the 2009 joint COST 2101 and 2102 international conference on Biometric ID management and multimodal communication
Automatic inference of complex affective states
Computer Speech and Language
Prosody-preserving voice transformation to evaluate brain representations of speech sounds
IEEE Transactions on Audio, Speech, and Language Processing
Estonian Emotional Speech Corpus: Culture and Age in Selecting Corpus Testers
Proceedings of the 2010 conference on Human Language Technologies -- The Baltic Perspective: Proceedings of the Fourth International Conference Baltic HLT 2010
Uncertainty in Spoken Dialogue Management
Proceedings of the 2010 conference on Human Language Technologies -- The Baltic Perspective: Proceedings of the Fourth International Conference Baltic HLT 2010
ACM Transactions on Intelligent Systems and Technology (TIST)
Proceedings of the Third COST 2102 international training school conference on Toward autonomous, adaptive, and context-aware multimodal interfaces: theoretical and practical issues
Automatic recognition of emotional state in Polish speech
Proceedings of the Third COST 2102 international training school conference on Toward autonomous, adaptive, and context-aware multimodal interfaces: theoretical and practical issues
Designing a Hungarian multimodal database - speech recording and annotation
Proceedings of the Third COST 2102 international training school conference on Toward autonomous, adaptive, and context-aware multimodal interfaces: theoretical and practical issues
Automatic speech emotion recognition using modulation spectral features
Speech Communication
ikannotate - a tool for labelling, transcription, and annotation of emotionally coloured speech
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part I
Towards real-time affect detection based on sample entropy analysis of expressive gesture
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part I
EMOGIB: emotional gibberish speech database for affective human-robot interaction
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part II
Developing a consistent view on emotion-oriented computing
MLMI'05 Proceedings of the Second international conference on Machine Learning for Multimodal Interaction
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
Annotating multimodal behaviors occurring during non basic emotions
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
Real-Life emotion representation and detection in call centers data
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
Piecing together the emotion jigsaw
MLMI'04 Proceedings of the First international conference on Machine Learning for Multimodal Interaction
CONTEXT'05 Proceedings of the 5th international conference on Modeling and Using Context
Pointing gestures and synchronous communication management
COST'09 Proceedings of the Second international conference on Development of Multimodal Interfaces: active Listening and Synchrony
Influence of speakers' emotional states on voice recognition scores
COST'10 Proceedings of the 2010 international conference on Analysis of Verbal and Nonverbal Communication and Enactment
COST'10 Proceedings of the 2010 international conference on Analysis of Verbal and Nonverbal Communication and Enactment
Classification of emotional speech using 3DEC hierarchical classifier
Speech Communication
Emotion recognition from speech: a review
International Journal of Speech Technology
Vocal markers of emotion: Comparing induction and acting elicitation
Computer Speech and Language
EmoTales: creating a corpus of folk tales with emotional annotations
Language Resources and Evaluation
International Journal of Synthetic Emotions
International Journal of Technology Diffusion
Research on speech and emotion is moving from a period of exploratory research into one where there is a prospect of substantial applications, notably in human-computer interaction. Progress in the area relies heavily on the development of appropriate databases. This paper addresses four main issues that need to be considered in developing databases of emotional speech: scope, naturalness, context and descriptors. The state of the art is reviewed. A good deal has been done to address the key issues, but there is still a long way to go. The paper shows how the challenge of developing appropriate databases is being addressed in three major recent projects: the Reading-Leeds project, the Belfast project and the CREST-ESP project. From these and other studies the paper draws together the tools and methods that have been developed, addresses the problems that arise and indicates future directions for the development of emotional speech databases.