Interfaces for discourse summarisation: a human factors analysis

  • Authors:
  • Agata McCormac; Kathryn Parsons; Marcus Butavicius; Aaron Ceglar; Derek Weber; Tim Pattison; Richard Leibbrandt; Kenneth Treharne; David Powers

  • Affiliations:
  • Defence Science and Technology Organisation, Edinburgh, SA (Agata McCormac, Kathryn Parsons, Marcus Butavicius, Aaron Ceglar, Derek Weber, Tim Pattison); Flinders University, Bedford Park, SA (Richard Leibbrandt, Kenneth Treharne, David Powers)

  • Venue:
  • Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration
  • Year:
  • 2013


Abstract

Empirical studies assessing the effectiveness of novel document interfaces are becoming more prevalent; however, relatively little attention has been paid to how such tools perform with less structured documents featuring multiple contributors. Participants in this study used different interfaces to answer questions that required exploring collaborative discourse. User performance was influenced by an interaction of interface, transcript, and question type. Individual differences also affected performance: higher education levels and higher general knowledge scores were associated with better task performance. The results also revealed that unnecessary interface functionality can hinder performance.