Heuristic Evaluation of Mission-Critical Software Using a Large Team

  • Authors:
  • Tim Buxton; Alvin Tarrell; Ann Fruhling

  • Affiliations:
  • Peter Kiewit Institute - Office 174C, University of Nebraska Omaha, Omaha, NE 68182 (all authors)

  • Venue:
  • Proceedings of the 13th International Conference on Human-Computer Interaction. Part IV: Interacting in Various Application Domains
  • Year:
  • 2009


Abstract

Heuristic evaluation is a common technique for assessing usability, but it is most often conducted with a team of 3 to 5 evaluators. Our project involved a team of 16 stakeholders assessing the usability of a mission-critical decision support system for the US military. Data collected from so many evaluators could easily become overwhelming, so we devised a method that first filters evaluations based on agreement among evaluators and then prioritizes the remaining findings based on their individual Frequency, Impact, and Severity scores. We termed our methodology the 'Integrated Stakeholder Usability Evaluation Process' and believe it will be useful to other researchers conducting heuristic evaluations with large groups.
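
The abstract only outlines the filter-then-prioritize step, so the sketch below is one plausible way such a pipeline could look. The data model (`Finding`), the agreement threshold, and the multiplicative Frequency x Impact x Severity priority score are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a filter-then-prioritize pipeline like the one the
# abstract describes. Thresholds, scales, and the scoring formula are
# illustrative assumptions, not the authors' published method.
from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    evaluators_reporting: int   # how many of the 16 evaluators flagged it
    frequency: int              # assumed scale, e.g. 1 (rare) .. 4 (constant)
    impact: int                 # assumed scale, e.g. 1 (minor) .. 4 (blocks task)
    severity: int               # assumed scale, e.g. 1 (cosmetic) .. 4 (critical)

TEAM_SIZE = 16
AGREEMENT_THRESHOLD = 0.25      # assumed: keep findings >= 4 of 16 evaluators agree on

def prioritize_findings(findings: list[Finding]) -> list[Finding]:
    """Filter by evaluator agreement, then rank by a combined F/I/S score."""
    agreed = [f for f in findings
              if f.evaluators_reporting / TEAM_SIZE >= AGREEMENT_THRESHOLD]
    # Multiplicative scoring is one common choice; the paper may weight differently.
    return sorted(agreed,
                  key=lambda f: f.frequency * f.impact * f.severity,
                  reverse=True)

if __name__ == "__main__":
    sample = [
        Finding("Inconsistent status icons", 9, frequency=3, impact=2, severity=2),
        Finding("No undo for message send", 5, frequency=2, impact=4, severity=4),
        Finding("Typo in help text", 1, frequency=1, impact=1, severity=1),
    ]
    for f in prioritize_findings(sample):
        print(f.description, f.frequency * f.impact * f.severity)
```

In this sketch the agreement filter drops the typo finding (only 1 of 16 evaluators reported it), and the undo issue outranks the icon issue despite fewer reporters because its impact and severity scores dominate the product.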