All talk and all action: strategies for managing voicemail messages
CHI 98 Conference Summary on Human Factors in Computing Systems
Automated message prioritization: making voicemail retrieval more efficient
CHI '02 Extended Abstracts on Human Factors in Computing Systems
A computational model for the automatic recognition of affect in speech
A user-independent real-time emotion recognition system for software agents in domestic environments
Engineering Applications of Artificial Intelligence
Resilience in the face of innovation: Household trials with BubbleBoard
International Journal of Human-Computer Studies
Acoustic feature selection for automatic emotion recognition from speech
Information Processing and Management: an International Journal
Computer Speech and Language
Advances in Human-Computer Interaction - Special issue on emotion-aware natural interaction
Automatic emotion recognition from speech: a PhD research proposal
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part II
IG-Based feature extraction and compensation for emotion recognition from speech
ACII'05 Proceedings of the First international conference on Affective Computing and Intelligent Interaction
International Journal of Speech Technology
Inferring pragmatics from dialogue contexts in simulated virtual agent games
AEGS'11 Proceedings of the 2011 international conference on Agents for Educational Games and Simulations
Voicemail has become an integral part of our personal and professional communication. The number of messages that accumulate in our voice mailboxes necessitates new ways of prioritizing them: currently, we must actively listen to every message to determine which ones are important and which can be attended to later. In this paper, we describe Emotive Alert, a system that detects salient emotions in a new message and notifies the account owner along several affective axes, including urgency, formality, valence (happy vs. sad), and arousal (calm vs. excited). We use a purely acoustic, HMM-based approach to identify the emotions, which allows the system to be applied to all messages independent of language.
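A common way to realize the acoustic, HMM-based classification the abstract describes is to train one HMM per affect label and assign a new message to the label whose model scores its acoustic feature sequence highest. The sketch below illustrates this idea with toy discrete HMMs over quantized energy levels; the model parameters, labels, and function names are illustrative assumptions, not the paper's trained models.

```python
import numpy as np

def forward_log_likelihood(obs, log_pi, log_A, log_B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space for stability."""
    alpha = log_pi + log_B[:, obs[0]]  # initial step
    for o in obs[1:]:
        # alpha'_j = logsumexp_i(alpha_i + log A_ij) + log B_j(o)
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

def classify(obs, models):
    """Pick the affect label whose HMM scores the sequence highest."""
    return max(models, key=lambda lbl: forward_log_likelihood(obs, *models[lbl]))

def log(m):
    return np.log(np.asarray(m, dtype=float))

# Two toy 2-state HMMs over 3 quantized energy levels (0=low .. 2=high):
# the "excited" model favors high-energy symbols, the "calm" model low ones.
models = {
    "excited": (log([0.5, 0.5]),
                log([[0.7, 0.3], [0.3, 0.7]]),
                log([[0.1, 0.3, 0.6], [0.2, 0.4, 0.4]])),
    "calm":    (log([0.5, 0.5]),
                log([[0.7, 0.3], [0.3, 0.7]]),
                log([[0.6, 0.3, 0.1], [0.5, 0.4, 0.1]])),
}

print(classify([2, 2, 1, 2, 2], models))  # high-energy sequence -> "excited"
print(classify([0, 0, 1, 0, 0], models))  # low-energy sequence  -> "calm"
```

Because scoring relies only on acoustic features (here, energy level symbols), the same pipeline works regardless of the language spoken in the message, which matches the language-independence claim above.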