It is a challenge for usability experts to perform usability inspections of interactive systems that are tailored to work domains of which these experts have little knowledge. To counter this, usability inspections with work-domain experts have been explored, but little empirical research has been reported on these experts' performance as evaluators. The present study compared the performance of work-domain experts and usability experts with respect to validity and thoroughness. The work-domain experts were characterized by high computer experience and low system experience. The usability experts were recruited from different ICT companies. The usability inspection method applied was group-based expert walkthrough, a method developed specifically to support non-usability experts as evaluators. The criterion for performance comparison was established through user tests. Fifteen work-domain experts and 12 usability experts participated in the study. The work-domain experts generated equally valid but less thorough usability inspection results than did the usability experts. This finding implies that work-domain experts may be used as evaluators in usability inspections without compromising validity. Moreover, the usability inspection performance of nominal groups of evaluators was explored. Nominal groups of work-domain experts were found to produce results of similar quality to those of nominal groups of usability experts, given that group size is disregarded. This finding may serve as a basis for hypotheses in future studies on the usability inspection performance of nominal groups of work-domain experts.
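The two comparison criteria can be made concrete. The following is a minimal sketch of the standard inspection-performance metrics, where thoroughness is the share of real problems (those confirmed by user testing) that an evaluator group detected, and validity is the share of the group's reported problems that are real. The function names and the problem-ID sets are illustrative, not taken from the study's materials.

```python
def thoroughness(found, real):
    """Fraction of the real problems that the inspection detected."""
    return len(set(found) & set(real)) / len(set(real))

def validity(found, real):
    """Fraction of the reported problems that are real (criterion: user tests)."""
    return len(set(found) & set(real)) / len(set(found))

# Illustrative data: problem IDs confirmed by user testing vs. those
# reported by one evaluator group.
real_problems = {"p1", "p2", "p3", "p4"}
reported = {"p1", "p2", "p5"}

print(thoroughness(reported, real_problems))  # 2 of 4 real problems found -> 0.5
print(validity(reported, real_problems))      # 2 of 3 reports are real -> ~0.667
```

Under these definitions, the study's finding reads as: the work-domain experts matched the usability experts on validity but scored lower on thoroughness.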