The Damage Index: an aggregation tool for usability problem prioritisation
BCS '10 Proceedings of the 24th BCS Interaction Specialist Group Conference
In this paper we describe a new way to perform heuristic evaluations that allows multiple evaluators to easily compare and combine the results of their reviews. The method was developed to provide a single, reliable result to the client, but it also allowed us to negotiate differences in our findings and to prioritise the usability problems identified by the evaluation. An unexpected side effect is that, by using this evaluation method, the practitioner can measure and predict the effect of usability improvements.
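To make the idea of combining multiple evaluators' findings concrete, here is a minimal sketch of one possible aggregation: each evaluator reports the problems they found with a severity rating, and problems are prioritised by mean severity weighted by how many evaluators found them. The scoring rule, function name, and data shape are illustrative assumptions, not the paper's actual Damage Index formula.

```python
# Illustrative sketch: aggregate heuristic-evaluation findings from
# several evaluators into one prioritised problem list.
# The weighting scheme below (mean severity x fraction of evaluators
# who reported the problem) is an assumed example, not the Damage Index.

from collections import defaultdict

def aggregate(findings):
    """findings: list of (evaluator_id, problem_id, severity 0-4).

    Returns [(problem_id, priority_score), ...] sorted high to low.
    """
    severities = defaultdict(list)
    for evaluator, problem, severity in findings:
        severities[problem].append(severity)

    n_evaluators = len({e for e, _, _ in findings})
    ranked = []
    for problem, sevs in severities.items():
        coverage = len(sevs) / n_evaluators        # agreement across evaluators
        mean_severity = sum(sevs) / len(sevs)
        ranked.append((problem, round(coverage * mean_severity, 2)))

    ranked.sort(key=lambda item: -item[1])
    return ranked

findings = [
    ("A", "ambiguous-labels", 3),
    ("B", "ambiguous-labels", 4),
    ("A", "no-undo", 4),
    ("B", "slow-feedback", 2),
]
print(aggregate(findings))
# A problem found by both evaluators outranks a more severe one
# found by only a single evaluator.
```

A scheme like this supports the abstract's claims: a shared scoring format lets evaluators see exactly where their ratings diverge, and re-running the aggregation after a fix shows how the overall problem ranking shifts.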