Entertainment robots are becoming commonplace in the home. Users are less fearful of interacting with robotic systems; however, these interactions are often limited to performing pre-recorded sequences of actions. The next generation of consumer-level entertainment robots should offer more natural interfaces and more engaging interaction. This paper reports on the development and evaluation of a consumer-level robotic dog with acoustic emotion recognition capabilities. The dog can recognise the emotional state of its owner from affective cues in the owner's speech and respond with appropriate actions. The evaluation study shows that users perceive the new robotic dog as emotionally intelligent and report that this makes the dog appear more 'alive'.
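The abstract does not give implementation details of the recognition pipeline. As a minimal illustrative sketch only, the kind of loop described — extract acoustic cues from the owner's speech, infer an emotion label, map it to a dog behaviour — could look like the following. All feature choices, thresholds, emotion labels, and action mappings here are hypothetical, not taken from the paper; real systems typically use richer features (pitch contours, MFCCs) and a trained classifier.

```python
import numpy as np

def extract_features(signal):
    """Crude acoustic cues: RMS energy (loudness) and zero-crossing
    rate (a rough proxy for spectral brightness/arousal)."""
    energy = float(np.sqrt(np.mean(signal ** 2)))
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    return energy, zcr

def classify_emotion(energy, zcr, energy_thresh=0.1, zcr_thresh=0.05):
    # Hypothetical rule-of-thumb thresholds: loud speech suggests high
    # arousal (excited/angry); quiet speech suggests low arousal.
    if energy > energy_thresh:
        return "excited" if zcr > zcr_thresh else "angry"
    return "sad" if zcr < zcr_thresh else "calm"

# Hypothetical mapping from a recognised emotion to a dog behaviour.
ACTIONS = {
    "excited": "wag tail and bark",
    "angry": "lower head and retreat",
    "sad": "nuzzle owner",
    "calm": "sit attentively",
}

def respond(signal):
    """Return the behaviour the dog would perform for this utterance."""
    energy, zcr = extract_features(signal)
    return ACTIONS[classify_emotion(energy, zcr)]
```

In practice the thresholds would be replaced by a classifier trained on labelled emotional speech, but the overall structure — features, emotion label, action lookup — matches the interaction the abstract describes.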