Let your users do the testing: a comparison of three remote asynchronous usability testing methods

  • Authors:
  • Anders Bruun, Peter Gull, Lene Hofmeister, Jan Stage

  • Affiliations:
  • Mjølner Informatics A/S, Århus N, Denmark; Jyske Bank A/S, Silkeborg, Denmark; Nykredit A/S, Aalborg, Denmark; Aalborg University, Aalborg, Denmark

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2009

Abstract

Remote asynchronous usability testing is characterized by both a spatial and a temporal separation of users and evaluators. This has the potential both to reduce the practical problems of securing user attendance and to allow direct involvement of users in usability testing. In this paper, we report on an empirical study in which we systematically compared three methods for remote asynchronous usability testing: user-reported critical incidents, forum-based online reporting and discussion, and diary-based longitudinal user reporting. In addition, conventional laboratory-based think-aloud testing was included as a benchmark for the remote methods. The results show that each remote asynchronous method supports identification of a considerable number of usability problems. Although this is only about half of the problems identified with the conventional method, the remote methods require significantly less time. This makes remote asynchronous methods an appealing option for usability testing in many software projects.