That is your evidence?: Classifying stance in online political debate

  • Authors:
  • Marilyn A. Walker, Pranav Anand, Rob Abbott, Jean E. Fox Tree, Craig Martell, Joseph King

  • Affiliations:
  • Natural Language and Dialogue Systems Lab, University of California Santa Cruz, USA (Walker, Anand, Abbott, Fox Tree, King); Natural Language Processing Lab, Naval Postgraduate School, USA (Martell)

  • Venue:
  • Decision Support Systems
  • Year:
  • 2012

Abstract

A growing body of work has highlighted the challenges of identifying the stance that a speaker holds towards a particular topic, a task that involves identifying a holistic subjective disposition. We examine stance classification on a corpus of 4731 posts from the debate website ConvinceMe.net, covering 14 topics ranging from the playful to the ideological. We show that ideological debates feature a greater share of rebuttal posts, and that rebuttal posts are significantly harder to classify for stance, for both humans and trained classifiers. We also demonstrate that the number of subjective expressions varies across debates, a fact that correlates with the performance of systems sensitive to sentiment-bearing terms. We present per-topic stance classification accuracies ranging from 60% to 75%, compared to unigram baselines that vary between 47% and 66%. Our results suggest that features and methods that take into account the dialogic context of such posts improve accuracy.
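
To make the unigram baseline mentioned in the abstract concrete, the sketch below trains a bag-of-words stance classifier on a handful of invented debate posts. This is an illustrative assumption, not the authors' system: the posts, labels, and choice of logistic regression are placeholders, and the real evaluation was run per topic on the ConvinceMe.net corpus.

```python
# Minimal sketch of a unigram stance-classification baseline.
# The data and classifier are illustrative; they do not reproduce
# the paper's corpus, features, or learning algorithm.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical posts from a single debate topic, labeled "pro" or "con".
posts = [
    "Evolution is supported by overwhelming fossil evidence.",
    "That is your evidence? A handful of disputed fossils?",
    "The genetic record independently confirms common descent.",
    "Complex organs could not arise by chance alone.",
]
stances = ["pro", "con", "pro", "con"]

# Unigram bag-of-words features feeding a linear classifier.
baseline = make_pipeline(
    CountVectorizer(ngram_range=(1, 1), lowercase=True),
    LogisticRegression(max_iter=1000),
)

# Per-topic evaluation by cross-validation; the paper reports
# accuracy separately for each debate topic.
scores = cross_val_score(baseline, posts, stances, cv=2, scoring="accuracy")
print(f"Mean cross-validation accuracy: {scores.mean():.2f}")
```

A dialogue-aware system of the kind the abstract alludes to would add features beyond this unigram representation, for example whether a post is marked as a rebuttal or properties of the post it replies to.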