The lack of standard assessment criteria for reliably comparing usability evaluation methods (UEMs) is an important gap in HCI knowledge. Metrics for assessing the thoroughness, validity, and effectiveness of UEMs, grounded in user data, have recently been proposed to bridge this gap. This paper reports our findings from applying these metrics in a study comparing heuristic evaluation (HE) with HE-Plus, an extended version of HE. Our experiment showed greater overlap among the HE-Plus evaluators than among the HE evaluators, indicating higher reliability of the method. When evaluation data from usability testing of the same website were used to calculate the UEM performance metrics, HE-Plus outperformed HE on all assessment criteria, with improvements of 17% in thoroughness, 39% in validity, and 67% in effectiveness. The paper concludes with a discussion of the limitations of the effectiveness of the UEM from which the real users' data were obtained.
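The three metrics named above are commonly defined over sets of usability problems: thoroughness is the fraction of real (user-confirmed) problems a method finds, validity is the fraction of a method's reported problems that are real, and effectiveness is their product. The sketch below illustrates these standard definitions; the problem IDs and counts are hypothetical, not data from this study.

```python
def uem_metrics(reported, real):
    """Compute standard UEM performance metrics from problem sets.

    reported: set of problem IDs a method's evaluators reported
    real:     set of problem IDs confirmed as real by user testing
    """
    hits = reported & real
    thoroughness = len(hits) / len(real) if real else 0.0      # real problems found / all real problems
    validity = len(hits) / len(reported) if reported else 0.0  # real problems found / all reported problems
    effectiveness = thoroughness * validity
    return thoroughness, validity, effectiveness

# Hypothetical example: a method reports 8 problems, 6 of which are
# among the 10 problems confirmed by user testing.
reported = {f"p{i}" for i in range(8)}
real = {f"p{i}" for i in range(6)} | {"q1", "q2", "q3", "q4"}
t, v, e = uem_metrics(reported, real)
# t = 0.6, v = 0.75, e = 0.45
```

A method can thus score well on thoroughness while scoring poorly on validity (many false positives), which is why effectiveness combines the two.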