Emotional human-machine interaction: cues from facial expressions

  • Authors:
  • Tessa-Karina Tews;Michael Oehl;Felix W. Siebert;Rainer Höger;Helmut Faasch

  • Affiliations:
  • Leuphana University of Lüneburg, Institute of Experimental Industrial Psychology, Lüneburg, Germany (all authors)

  • Venue:
  • HI'11: Proceedings of the 2011 International Conference on Human Interface and the Management of Information - Volume Part I
  • Year:
  • 2011


Abstract

Emotion detection provides a promising basis for human-centered design of Human-Machine Interfaces, and Affective Computing can facilitate human-machine communication. In cars, this enables adaptive advanced driver assistance systems (ADAS) that respond to the driver's emotional state. In contrast to the majority of former studies, which used only static recognition methods, we investigated a new dynamic approach to detecting emotions in facial expressions, both in an artificial setting and in a driving context. By analyzing changes in an area defined by a number of dots arranged on participants' faces, we extracted variables to classify the participants' emotions according to the Facial Action Coding System. The results of this novel way of categorizing emotions lead to a discussion of further applications and limitations, framing a tentative approach to emotion detection in cars. Implications for further research and applications are outlined.
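The core of the dynamic approach described above — tracking how the area spanned by facial dots changes between frames — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the dot coordinates, the shoelace area formula, and the relative-change feature are assumptions about how such a variable could be computed.

```python
def polygon_area(dots):
    """Area of the polygon defined by the facial dots (shoelace formula)."""
    n = len(dots)
    area = 0.0
    for i in range(n):
        x1, y1 = dots[i]
        x2, y2 = dots[(i + 1) % n]  # wrap around to close the polygon
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def area_change(dots_before, dots_after):
    """Relative change of the dot-defined area between two frames --
    one candidate variable for classifying a facial expression."""
    a0 = polygon_area(dots_before)
    a1 = polygon_area(dots_after)
    return (a1 - a0) / a0

# Hypothetical example: dots around the mouth region in a neutral frame ...
neutral = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
# ... and in a smiling frame, where the mouth widens and the area grows.
smiling = [(-0.2, 0.0), (1.2, 0.0), (1.2, 1.0), (-0.2, 1.0)]
print(area_change(neutral, smiling))  # positive value: the area expanded
```

In a full pipeline, such per-frame area-change values would be computed for several dot-defined regions and fed to a classifier mapping them onto Facial Action Coding System action units.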