Informing intelligent user interfaces by inferring affective states from body postures in ubiquitous computing environments

  • Authors:
  • Chiew Seng Sean Tan; Johannes Schöning; Kris Luyten; Karin Coninx

  • Affiliations:
  • Hasselt University - tUL - iMinds, Diepenbeek, Belgium (all authors)

  • Venue:
  • Proceedings of the 2013 International Conference on Intelligent User Interfaces
  • Year:
  • 2013


Abstract

Intelligent user interfaces can benefit from knowledge of the user's emotional state. However, current approaches to detecting affective states often constrain the user's freedom of movement by instrumenting her with sensors, which prevents affective computing from being deployed in naturalistic and ubiquitous computing contexts. In this paper, we present a novel system called mASqUE, which uses a set of association rules to infer a person's affective state from their body postures. This is done without any user instrumentation, using only off-the-shelf, inexpensive commodity hardware: a depth camera tracks the users' body postures, which also serve as an indicator of their openness. By combining the posture information with physiological sensor measurements, we were able to mine a set of association rules relating postures to affective states. We demonstrate the feasibility of inferring affective states from body postures in ubiquitous computing environments, and our study provides insights into how this opens up new possibilities for intelligent user interfaces to access the affective states of users from body postures in a non-intrusive way.
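
The association-rule mining step described in the abstract could be prototyped along the lines of the sketch below. This is a minimal illustration and not the authors' implementation: the posture descriptors (e.g., "arms_crossed"), the affect labels, and the support/confidence thresholds are hypothetical placeholders chosen only to show how rules of the form {posture features} => affective state might be extracted from labelled observations.

```python
from collections import Counter
from itertools import combinations

# Hypothetical observations: each is (set of discretized posture features, affect label).
# Feature names and labels are illustrative placeholders, not the paper's actual categories.
observations = [
    ({"arms_crossed", "leaning_back"}, "low_arousal"),
    ({"arms_open", "leaning_forward"}, "high_valence"),
    ({"arms_crossed", "head_down"}, "low_valence"),
    ({"arms_open", "leaning_forward"}, "high_valence"),
    ({"arms_crossed", "leaning_back"}, "low_arousal"),
]

MIN_SUPPORT = 0.3      # assumed: fraction of observations containing the rule
MIN_CONFIDENCE = 0.8   # assumed: P(affect label | posture pattern)

def mine_rules(data, min_support, min_confidence):
    """Mine simple posture -> affect rules of the form {features} => label."""
    n = len(data)
    antecedent_counts = Counter()
    rule_counts = Counter()
    for features, label in data:
        # Enumerate non-empty subsets of the posture features as candidate antecedents.
        for size in range(1, len(features) + 1):
            for subset in combinations(sorted(features), size):
                antecedent_counts[subset] += 1
                rule_counts[(subset, label)] += 1
    rules = []
    for (subset, label), count in rule_counts.items():
        support = count / n
        confidence = count / antecedent_counts[subset]
        if support >= min_support and confidence >= min_confidence:
            rules.append((subset, label, support, confidence))
    return rules

for antecedent, label, support, confidence in mine_rules(observations, MIN_SUPPORT, MIN_CONFIDENCE):
    print(f"{set(antecedent)} => {label}  (support={support:.2f}, confidence={confidence:.2f})")
```

In practice the antecedent features would come from the depth camera's posture tracking and the affect labels from the physiological sensor measurements collected during training, as described in the abstract; the toy counting approach above could then be replaced by a standard frequent-itemset miner.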