The strengths and weaknesses of heuristic evaluation have been well researched. Despite its known weaknesses, heuristic evaluation remains widely used because formal usability testing (also referred to as empirical user testing) is more costly and time-consuming. What has received less attention is the type of information heuristic evaluation conveys in comparison to empirical user testing supported by eye tracking and user observation. When usability methods are combined, it becomes even more important to distinguish the information contributed by each method. This paper investigates the application of two usability evaluation methods, namely heuristic evaluation and empirical user testing supported by eye tracking, to the website of a learning management system, with the intent of discovering the differences in the usability information they yield. Heuristic evaluation, as an inspection method, is widely accepted to be fundamentally different from empirical user testing. This paper contributes to a deeper understanding of the nature of these differences by identifying the kinds of usability problems uncovered by each method. The findings should be of interest to researchers, designers and usability practitioners involved in website design and evaluation.