Unobtrusive Sensing of Emotions (USE)

  • Authors:
  • Egon L. van den Broek; Marleen H. Schut; Joyce H. D. M. Westerink; Kees Tuinenbreijer

  • Affiliations:
  • (Corresponding author, e-mail: vandenbroek@acm.org) Center for Telematics and Information Technology (CTIT), University of Twente, P.O. Box 217, 7500 AE, Enschede, The Netherlands
  • Philips Consumer Lifestyle Advanced Technology, High Tech Campus 37, 5656 AE, Eindhoven, The Netherlands (e-mail: {marleen.schut,kees.tuinenbreijer}@philips.com)
  • User Experience Group, Philips Research, High Tech Campus 34, 5656 AE, Eindhoven, The Netherlands (e-mail: joyce.westerink@philips.com)

  • Venue:
  • Journal of Ambient Intelligence and Smart Environments
  • Year:
  • 2009

Abstract

Emotions are acknowledged as a crucial element of artificial intelligence (AI); as is illustrated, this is no different for Ambient Intelligence (AmI). Unobtrusive Sensing of Emotions (USE) is introduced to enrich AmI with empathic abilities. USE combines speech and the electrocardiogram (ECG) into a powerful and unique pairing for unraveling people's emotions. In a controlled study, 40 people watched film scenes in either an office or a home-like setting. It is shown that, when people's gender is taken into account, both heart rate variability (derived from the ECG) and the standard deviation of the fundamental frequency of speech indicate people's experienced valence and arousal in parallel; as such, both measures validate each other. Thus, through USE, reliable cues can be derived that indicate people's emotional state, in particular when their environment is also taken into account. Since all of this is crucial for both AI and true AmI, this study provides a first significant leap toward making AmI a success.
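
The abstract names two concrete emotion markers: heart rate variability derived from the ECG and the standard deviation of the fundamental frequency (F0) of speech. The following minimal sketch shows how such features could be computed; it is not the authors' code, and it assumes that R-peak detection (yielding RR intervals in milliseconds) and pitch tracking (yielding an F0 contour in Hz) have already been performed. All example values are hypothetical.

    # Illustrative sketch only, not the authors' implementation.
    # Assumes RR intervals (ms) and an F0 contour (Hz) are already extracted.
    import numpy as np

    def hrv_features(rr_intervals_ms):
        """Simple time-domain heart rate variability measures from RR intervals (ms)."""
        rr = np.asarray(rr_intervals_ms, dtype=float)
        sdnn = np.std(rr, ddof=1)                   # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term, beat-to-beat variability
        return {"SDNN": sdnn, "RMSSD": rmssd}

    def f0_std(f0_contour_hz):
        """Standard deviation of F0, ignoring unvoiced frames (coded as 0 or NaN)."""
        f0 = np.asarray(f0_contour_hz, dtype=float)
        voiced = f0[np.isfinite(f0) & (f0 > 0)]
        return np.std(voiced, ddof=1)

    # Hypothetical example values, for illustration only.
    print(hrv_features([812, 845, 790, 860, 825, 805]))
    print(f0_std([0, 198.2, 201.5, 0, 195.7, 203.1, 199.9]))

In the study, such per-participant feature values would then be related to the reported valence and arousal (with gender and setting as factors); the sketch only covers the feature extraction step mentioned in the abstract.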