Heuristic evaluation of user interfaces
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
User interface evaluation in the real world: a comparison of four techniques
CHI '91 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
An automated cognitive walkthrough
CHI '91 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Refining the test phase of usability evaluation: how many subjects is enough?
Human Factors - Special issue: measurement in human factors
Comparison of empirical testing and walkthrough methods in user interface evaluation
CHI '92 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
What is gained and lost when using evaluation methods other than empirical testing
HCI '92 Proceedings of the conference on People and computers VII
Usability inspection methods
Enhancing the explanatory power of usability heuristics
CHI '94 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Cognitive engineering principles for enhancing human-computer performance
International Journal of Human-Computer Interaction
User analysis in HCI—the historical lessons from individual differences research
International Journal of Human-Computer Studies
A comparison of usability techniques for evaluating design
DIS '97 Proceedings of the 2nd conference on Designing interactive systems: processes, practices, methods, and techniques
Evaluating a multimedia authoring tool
Journal of the American Society for Information Science - Special issue on current research in human-computer interaction
A toolkit for strategic usability: results from workshops, panels, and surveys
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
The user action framework: a reliable foundation for usability engineering support tools
International Journal of Human-Computer Studies
Perspective-based usability inspection: an empirical validation of efficacy
Empirical Software Engineering
Testing web sites: five users is nowhere near enough
CHI '01 Extended Abstracts on Human Factors in Computing Systems
Heuristic evaluation of ambient displays
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
CHI '03 Extended Abstracts on Human Factors in Computing Systems
Analysis of combinatorial user effect in international usability tests
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Applying user testing data to UEM performance metrics
CHI '04 Extended Abstracts on Human Factors in Computing Systems
Reconditioned merchandise: extended structured report formats in usability inspection
CHI '04 Extended Abstracts on Human Factors in Computing Systems
Comparative usability evaluation
Behaviour & Information Technology
Analysis of strategies for improving and estimating the effectiveness of heuristic evaluation
Proceedings of the 3rd Nordic Conference on Human-Computer Interaction
Usability engineering methods for software developers
Communications of the ACM - Interaction design and children
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Designers' use of paper and the implications for informal tools
OZCHI '05 Proceedings of the 17th Australia conference on Computer-Human Interaction: Citizens Online: Considerations for Today and the Future
Better discount evaluation: illustrating how critical parameters support heuristic creation
Interacting with Computers
Architecting for usability: a survey
Journal of Systems and Software
User-centered evaluation of adaptive and adaptable systems: a literature review
The Knowledge Engineering Review
Scenarios in the heuristic evaluation of mobile devices: emphasizing the context of use
HCD '09 Proceedings of the 1st International Conference on Human Centered Design: Held as Part of HCI International 2009
USAB '09 Proceedings of the 5th Symposium of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society on HCI and Usability for e-Inclusion
The usability inspection performance of work-domain experts: an empirical study
Interacting with Computers
Heuristic evaluation of usability of GeoWeb sites
W2GIS '07 Proceedings of the 7th international conference on Web and wireless geographical information systems
Design and evaluation guidelines for mental health technologies
Interacting with Computers
User experience to improve the usability of a vision-based interface
Interacting with Computers
Evaluating usability of web-based electronic government: users' perspective
HCII '11 Proceedings of the 14th international conference on Human-computer interaction: users and applications - Volume Part IV
Usability evaluation in software development practice
INTERACT '11 Proceedings of the 13th IFIP TC 13 international conference on Human-computer interaction - Volume Part IV
The explanatory power of playability heuristics
Proceedings of the 8th International Conference on Advances in Computer Entertainment Technology
Analysis in practical usability evaluation: a survey study
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Exploring the usability of web portals: A Croatian case study
International Journal of Information Management: The Journal for Information Professionals
Empirical validation of a usability inspection method for model-driven Web development
Journal of Systems and Software
Exploitation of heuristics for virtual environments
Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design
Sirius: a heuristic-based framework for measuring web usability adapted to the type of website
Journal of Systems and Software
Usability in designing assistive technology for children with learning disabilities
i-CREATe '11 Proceedings of the 5th International Conference on Rehabilitation Engineering & Assistive Technology
Criteria for identifying the focus of evaluation methods for collaborative systems
Proceedings of the X Brazilian Symposium on Collaborative Systems
Research on heuristic evaluation in recent years has focused on improving its effectiveness and efficiency relative to user testing. The aim of this paper is to refine a research agenda for comparing and contrasting evaluation methods. To that end, a framework is presented for evaluating the effectiveness of different types of support for structured usability problem reporting. The paper reports an empirical study of this framework comparing two sets of heuristics, Nielsen's heuristics and Gerhardt-Powals' cognitive engineering principles, and two reporting media, a web tool and paper. The study found no significant differences among the four groups in effectiveness, efficiency, or inter-evaluator reliability. A more significant contribution of this research is that the experimental framework proved successful and, owing to its thorough structure, should be reusable by other researchers.
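The abstract does not define its three measures, but in the UEM-comparison literature effectiveness, efficiency, and inter-evaluator reliability are commonly operationalized as thoroughness against a known problem set, real problems found per unit time, and any-two agreement across evaluators. The sketch below illustrates only those conventional measures, assuming problem reports are sets of problem IDs; the function names and sample data are illustrative assumptions, not taken from the paper.

```python
from itertools import combinations

def thoroughness(found, real):
    """Effectiveness proxy: share of the known real problems an evaluator reported."""
    return len(found & real) / len(real) if real else 0.0

def efficiency(found, real, minutes):
    """Real problems reported per minute of evaluation time."""
    return len(found & real) / minutes if minutes else 0.0

def any_two_agreement(reports):
    """Inter-evaluator reliability: mean |A & B| / |A | B| over all evaluator pairs."""
    overlaps = [len(a & b) / len(a | b)
                for a, b in combinations(reports, 2) if a | b]
    return sum(overlaps) / len(overlaps) if overlaps else 0.0

# Illustrative data: problem IDs reported by three evaluators against a known set.
real = {"p1", "p2", "p3", "p4", "p5"}
reports = [{"p1", "p2"}, {"p2", "p3", "p9"}, {"p1", "p2", "p4"}]

print([round(thoroughness(r, real), 2) for r in reports])  # [0.4, 0.4, 0.6]
print(round(any_two_agreement(reports), 2))                # 0.37
```

Comparing two heuristic sets and two reporting media, as the study does, would then reduce to computing these measures per evaluator group and testing the group differences for significance.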