"Emo Sim": expressing voice-based emotions in mobile interfaces

  • Authors:
  • Prabath Weerasinghe; Rasika Ranaweera; Senaka Amarakeerthi; Michael Cohen

  • Affiliations:
  • University of Aizu, Aizu-Wakamatsu, Fukushima-ken, Japan (all authors)

  • Venue:
  • Proceedings of the 13th International Conference on Humans and Computers
  • Year:
  • 2010


Abstract

Human interaction with mobile devices is currently a very active research area. Speech enriched with emotion is one of the major ways of exchanging ideas, especially via telephony. By analyzing a voice stream with a system based on Hidden Markov Models (HMMs) and Log Frequency Power Coefficients (LFPC), different emotions can be recognized. A simple Java client delivers each recognized emotion to a server as an index number. A mobile client then retrieves the emotion and displays it through colored icons. Each emotion is mapped to a particular color, as it is natural to use colors to represent various expressions. We believe that with the help of this application one could adjust one's way of talking, or avoid chatting with somebody whose emotional state is negative.
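To make the described pipeline concrete, below is a minimal Java sketch of the client side: mapping a recognized emotion index to a display color and posting that index to a server. The emotion labels, index order, colors, endpoint URL, and request format are illustrative assumptions, not details taken from the paper.

```java
// Minimal sketch of the emotion-index-to-color mapping and client upload
// described in the abstract. All names, colors, and the server URL are
// hypothetical placeholders, not the authors' actual implementation.
import java.awt.Color;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Map;

public class EmoSimClientSketch {

    // Hypothetical mapping of recognized emotion indices to display colors.
    static final Map<Integer, Color> EMOTION_COLORS = Map.of(
            0, Color.GRAY,    // neutral
            1, Color.RED,     // anger
            2, Color.YELLOW,  // joy
            3, Color.BLUE,    // sadness
            4, Color.MAGENTA  // fear
    );

    // Send the recognized emotion index to a (hypothetical) server endpoint.
    static void sendEmotionIndex(int index, String serverUrl) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(serverUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        byte[] body = ("emotion=" + index).getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        System.out.println("Server responded: " + conn.getResponseCode());
    }

    public static void main(String[] args) throws Exception {
        // In the real system this index would come from the HMM/LFPC recognizer.
        int recognizedIndex = 1;
        System.out.println("Display color: " + EMOTION_COLORS.get(recognizedIndex));
        sendEmotionIndex(recognizedIndex, "http://example.com/emosim/emotion");
    }
}
```

The mobile client would perform the inverse lookup: fetch the stored index from the server and render the icon in the corresponding color.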