Detecting action items in multi-party meetings: annotation and initial experiments

  • Authors:
  • Matthew Purver, Patrick Ehlen, John Niekrasz

  • Affiliations:
  • Center for the Study of Language and Information, Stanford University, Stanford, CA (all authors)

  • Venue:
  • MLMI'06: Proceedings of the Third International Conference on Machine Learning for Multimodal Interaction
  • Year:
  • 2006

Abstract

This paper presents the results of initial investigations and experiments into the automatic detection of action items in transcripts of multi-party human-human meetings. We start from the flat action item annotations of [1] and show that they support only limited automatic classification performance. We then describe a new hierarchical annotation schema based on the roles utterances play in the action item assignment process, and propose a corresponding approach to automatic detection that promises improved classification accuracy while also enabling the extraction of useful information for summarization and reporting.