The use of heuristics for evaluating interfaces is a well-studied area. Current research on heuristics falls into two main strands: analysing methods to improve the effectiveness of heuristic evaluations, and developing new heuristic sets for novel and specialised domains. This paper proposes an evidence-based design approach to developing domain-specific heuristics and shows how the method was applied in the context of computer-assisted assessment (CAA). A corpus of usability problems was created through a series of student surveys, heuristic evaluations, and a review of the literature. This corpus was then used to synthesise a set of domain-specific heuristics for evaluating CAA applications. The paper describes the process and presents the resulting set of heuristics.