Affective computing
Agents that care: investigating the effects of orientation of emotion exhibited by an embodied computer agent
Improving automotive safety by pairing driver emotion and car voice emotion
CHI '05 Extended Abstracts on Human Factors in Computing Systems
Performance analysis of acoustic emotion recognition for in-car conversational interfaces
UAHCI'07 Proceedings of the 4th international conference on Universal access in human-computer interaction: ambient interaction
Affect and Emotion in Human-Computer Interaction
The Composite Sensing of Affect
Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Interactive speech-based systems are moving into the car, since speech interactions are considered less detrimental to the driver than interactions with a display. The introduction of in-car speech-based interaction highlights the potential influence of linguistic and paralinguistic cues such as emotion. Emotions direct and focus people's attention on objects and situations, and affect performance, judgement and risk-taking. All of these properties are crucial for driving, where the smallest slip-up can have grave repercussions. Emotional cues in a car voice, paired with the emotional state of the driver, have been found to influence driving performance. These findings motivated the design of an in-car driver emotion detection and response system. Results show that the in-car system can recognise and track changes in the driver's emotional state. This study considers older drivers, who often feel both unsafe and insecure due to concerns about declining abilities, vision in particular.