Detecting emotions in conversations between driver and in-car information systems

  • Authors:
  • Christian Martyn Jones; Ing-Marie Jonsson

  • Affiliations:
  • School of Mathematical and Computer Sciences, Heriot-Watt University, Edinburgh, UK; Department of Communication, Stanford University, California

  • Venue:
  • ACII'05: Proceedings of the First International Conference on Affective Computing and Intelligent Interaction
  • Year:
  • 2005

Abstract

Speech interaction with in-car controls is becoming more commonplace, as it is considered less distracting to the driver. Cars of today are equipped with speech recognition systems to dial phone numbers and to control the cockpit environment, and satellite navigation systems provide the driver with verbal directions to their destination. This paper extends the speech interaction between driver and car to include automatic recognition of the driver's emotional state and appropriate responses by the car to improve the driver's mood. The emotional state of the driver has been found to influence driving performance, and by actively responding to it the car could help improve their driving.
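
The abstract does not describe an implementation, but the kind of speech-based emotion recognition it refers to can be illustrated with a minimal sketch: extracting simple prosodic statistics (pitch and energy) from driver utterances and training a classifier on labelled examples. The file names, emotion labels, feature set, and choice of an SVM below are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: the paper does not specify this pipeline.
# Assumes labelled driver utterances (WAV files) are available; the paths and
# emotion labels below are hypothetical placeholders.
import numpy as np
import librosa
from sklearn.svm import SVC

def prosodic_features(wav_path, sr=16000):
    """Extract simple prosodic statistics (pitch and energy) from one utterance."""
    y, sr = librosa.load(wav_path, sr=sr)
    # Fundamental frequency contour via the YIN estimator.
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
    # Frame-level energy (root-mean-square).
    rms = librosa.feature.rms(y=y)[0]
    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std()])

# Hypothetical training data: (file, emotion label) pairs.
train = [("driver_clip_01.wav", "happy"), ("driver_clip_02.wav", "upset")]
X = np.vstack([prosodic_features(path) for path, _ in train])
labels = [label for _, label in train]

clf = SVC(kernel="rbf")  # simple classifier; the paper does not prescribe one
clf.fit(X, labels)

# Classify a new utterance captured from the in-car microphone.
print(clf.predict([prosodic_features("new_driver_utterance.wav")]))
```

In practice such a recogniser would be trained on a much larger emotional speech corpus and coupled to the car's dialogue system so that the in-car voice can respond appropriately to the detected driver state, which is the scenario the paper investigates.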