Given that heuristic evaluation (HE) remains a popular evaluation method among practitioners despite criticisms of its performance and reliability, there is a need to improve the method. Several studies have shown HE-Plus, an emerging variant of HE, to outperform HE in both effectiveness and reliability. HE-Plus uses the same set of heuristics as HE; the only difference between the two methods is the 'usability problems profile' element in HE-Plus. This paper reports our attempt to verify the original profile employed in HE-Plus against the usability problem classification of the User Action Framework, together with an experiment evaluating the outcome by comparing HE against two profile-based HE variants (HE-Plus and HE++) and a control group. Our results confirmed the role of the 'usability problems profile' in improving the performance and reliability of heuristic evaluation: both HE-Plus and HE++ outperformed HE in effectiveness as well as reliability.