Multimodal subjectivity analysis of multiparty conversation

  • Authors:
  • Stephan Raaijmakers;Khiet Truong;Theresa Wilson

  • Affiliations:
  • TNO Information and Communication Technology, Delft, The Netherlands;TNO Defense, Security and Safety, Soesterberg, The Netherlands;University of Edinburgh, Edinburgh, UK

  • Venue:
  • EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
  • Year:
  • 2008


Abstract

We investigate the combination of several sources of information for the purpose of subjectivity recognition and polarity classification in meetings. We focus on features from two modalities, transcribed words and acoustics, and we compare the performance of three different textual representations: words, characters, and phonemes. Our experiments show that character-level features outperform word-level features for these tasks, and that a careful fusion of all features yields the best performance.
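To make the contrast between the textual representations concrete, the following is a minimal sketch (not the authors' pipeline) of how character n-gram features and word-level features can be extracted from a transcribed utterance; the function names and the choice of n=3 are illustrative assumptions.

```python
from collections import Counter

def char_ngrams(text, n=3):
    # Character n-grams (spaces included via padding) -- the kind of
    # representation the paper finds to outperform word-level features.
    padded = f" {text} "
    return Counter(padded[i:i + n] for i in range(len(padded) - n + 1))

def word_unigrams(text):
    # Word-level baseline: lowercased bag-of-words counts.
    return Counter(text.lower().split())

utterance = "that sounds really great"
print(char_ngrams(utterance, 3))
print(word_unigrams(utterance))
```

Character n-grams capture sub-word cues (prefixes, suffixes, disfluency fragments common in meeting speech) that a word-level bag-of-words misses, which is one plausible reason for their stronger performance on conversational transcripts.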