Comparative usability evaluation

  • Authors:
  • Rolf Molich; Meghan R. Ede; Klaus Kaasgaard; Barbara Karyukin

  • Affiliations:
  • DialogDesign, Skovkrogen 3, DK-3660 Stenlose, Denmark; Wells Fargo, San Francisco, CA; Yahoo!, Sunnyvale, CA; Xerox Corp., Wilsonville, OR

  • Venue:
  • Behaviour & Information Technology
  • Year:
  • 2004

Abstract

This paper reports on a study assessing the consistency of usability testing across organisations. Nine independent organisations evaluated the usability of the same website, Microsoft Hotmail. The results document wide differences in the selection and application of methodology, the resources applied, and the problems reported. The organisations reported 310 different usability problems. Only two problems were reported by six or more organisations, while 232 problems (75%) were uniquely reported, that is, each was reported by only one team. Some of the unique findings were classified as serious. Even the tasks used by most or all teams produced very different results: around 70% of the findings for each of these tasks were unique. Our main conclusion is that the simple assumption that we are all doing the same thing and getting the same results in a usability test is plainly wrong.