Multimodal object oriented user interfaces in mobile affective interaction

  • Authors:
  • Efthymios Alepis; Maria Virvou

  • Affiliations:
  • Department of Informatics, University of Piraeus, Piraeus, Greece 18534 (both authors)

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2012

Abstract

In this paper, we investigate an object-oriented (OO) architecture for multimodal emotion recognition in interactive applications running on mobile phones or handheld devices. Mobile phones differ from desktop computers in that they do not perform the emotion-recognition processing themselves, whereas desktop computers can. In our approach, mobile phones therefore pass all collected data to a server, which then performs the emotion recognition. The object-oriented architecture we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and microphone, as well as data from emotion stereotypes, and classifies this evidence into well-structured objects with their own properties and methods. The resulting emotion detection server can handle multimodal information transmitted from different mobile sources during human-computer interaction. As a test bed for affective mobile interaction, we have used an educational application incorporated into the mobile system.
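
To make the object-oriented structuring described in the abstract concrete, the sketch below shows one possible way to wrap per-modality evidence in objects, transmit them to a server-side component, and fuse them with stereotype weights. This is a minimal illustration only; all class names, fields, and the scoring heuristics are hypothetical and are not taken from the paper.

```java
import java.util.ArrayList;
import java.util.List;

/** Base type for evidence captured on the mobile device and sent to the server. */
abstract class ModalityEvidence {
    final long timestampMillis;
    ModalityEvidence(long timestampMillis) { this.timestampMillis = timestampMillis; }
    /** Each modality scores how strongly it suggests a given emotion (0..1). */
    abstract double scoreFor(String emotion);
}

/** Evidence derived from the device keyboard (e.g. typing speed, correction rate). */
class KeyboardEvidence extends ModalityEvidence {
    final double typingSpeed;   // keys per second
    final double backspaceRate; // corrections per key
    KeyboardEvidence(long ts, double typingSpeed, double backspaceRate) {
        super(ts);
        this.typingSpeed = typingSpeed;
        this.backspaceRate = backspaceRate;
    }
    @Override
    double scoreFor(String emotion) {
        // Toy heuristic: frequent corrections hint at frustration or anger.
        if (emotion.equals("anger")) return Math.min(1.0, backspaceRate * 2.0);
        return 0.1;
    }
}

/** Evidence derived from the device microphone (e.g. pitch and volume features). */
class MicrophoneEvidence extends ModalityEvidence {
    final double averagePitchHz;
    final double averageVolumeDb;
    MicrophoneEvidence(long ts, double pitch, double volume) {
        super(ts);
        this.averagePitchHz = pitch;
        this.averageVolumeDb = volume;
    }
    @Override
    double scoreFor(String emotion) {
        // Toy heuristic: loud, high-pitched speech hints at anger.
        if (emotion.equals("anger"))
            return (averageVolumeDb > 70 && averagePitchHz > 220) ? 0.8 : 0.2;
        return 0.1;
    }
}

/** Stereotype data about the user group, used here as a simple prior weight. */
class EmotionStereotype {
    double weightFor(String emotion) {
        return emotion.equals("anger") ? 1.2 : 1.0; // hypothetical prior
    }
}

/** Server-side component that fuses the transmitted evidence objects. */
class EmotionDetectionServer {
    private final List<ModalityEvidence> received = new ArrayList<>();
    private final EmotionStereotype stereotype = new EmotionStereotype();

    void receive(ModalityEvidence evidence) { received.add(evidence); }

    /** Average the per-modality scores and weight them by the stereotype prior. */
    double combinedScore(String emotion) {
        double sum = 0.0;
        for (ModalityEvidence e : received) sum += e.scoreFor(emotion);
        return received.isEmpty() ? 0.0 : stereotype.weightFor(emotion) * sum / received.size();
    }

    public static void main(String[] args) {
        EmotionDetectionServer server = new EmotionDetectionServer();
        // The mobile client would serialize and transmit these objects; here we call directly.
        server.receive(new KeyboardEvidence(System.currentTimeMillis(), 3.5, 0.4));
        server.receive(new MicrophoneEvidence(System.currentTimeMillis(), 240.0, 75.0));
        System.out.printf("anger score: %.2f%n", server.combinedScore("anger"));
    }
}
```

The design point the sketch illustrates is the one the abstract emphasizes: each modality is encapsulated as an object with its own properties and methods, so the server can fuse evidence from heterogeneous mobile sources through a single interface without knowing how each modality computes its score.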