Speech-Based Emotion Characterization Using Postures and Gestures in CVEs
CW '10 Proceedings of the 2010 International Conference on Cyberworlds
Human interaction with mobile devices is currently a very active research area. Speech enriched with emotion is one of the principal ways people exchange ideas, especially over the telephone. By analyzing a voice stream with a system based on a Hidden Markov Model (HMM) and Log Frequency Power Coefficients (LFPC), different emotions can be recognized. A simple Java client delivers each recognized emotion to a server as an index number. A mobile client then retrieves the emotion and displays it as a colored icon, with each emotion mapped to a particular color, since colors are a natural way to represent expressions. We believe that with the help of this application, a user could adjust the way they talk, or avoid chatting with somebody whose emotional state is negative.
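The mapping from a recognized emotion index to a display color could be sketched as follows. This is a minimal illustration in Java (the language the abstract names for the client); the specific emotion indices and color assignments are assumptions, as the abstract does not list them.

```java
import java.util.Map;

public class EmotionColorMap {
    // Hypothetical mapping from server-delivered emotion index to icon color.
    // The actual indices and colors used by the system are not given in the text.
    private static final Map<Integer, String> COLORS = Map.of(
        0, "GRAY",    // neutral
        1, "RED",     // anger
        2, "BLUE",    // sadness
        3, "YELLOW",  // happiness
        4, "PURPLE"   // fear
    );

    // Look up the icon color for a recognized emotion; fall back to
    // neutral gray for indices the client does not know about.
    public static String colorFor(int emotionIndex) {
        return COLORS.getOrDefault(emotionIndex, "GRAY");
    }

    public static void main(String[] args) {
        System.out.println(colorFor(1)); // RED under the assumed mapping
    }
}
```

A lookup table like this keeps the mobile client decoupled from the recognition system: the server only ever transmits a small integer, and the client decides how to render it.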