Two alternative user interface designs were subjected to user testing to measure user performance in a database query task. User performance was also estimated heuristically in three different ways and by formal GOMS modelling. The estimated values for absolute user performance had very high variability, but estimates of the relative advantage of the faster interface were less variable. Choosing the faster of the two designs would have a net present value more than 1,000 times the cost of obtaining the estimates. In our case study, a software manager would have made the correct choice every time if decisions had been based on at least three independent estimates. User testing was 4.9 times as expensive as the cheapest heuristic method but provided better performance estimates.
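The claim that three independent estimates suffice can be illustrated with a small simulation. This is a sketch, not the study's actual procedure: the true advantage (20%) and the estimate noise (standard deviation 30%) are hypothetical numbers chosen for illustration, and averaging is just one simple way to combine estimates.

```python
import random

def choose_faster(estimates):
    """Pick design A if the mean estimated advantage of A over B is positive."""
    return "A" if sum(estimates) / len(estimates) > 0 else "B"

def error_rate(true_advantage, noise_sd, k, trials=10_000, seed=1):
    """Fraction of simulated decisions that pick the slower design
    when k noisy, independent estimates of A's advantage are averaged."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        estimates = [rng.gauss(true_advantage, noise_sd) for _ in range(k)]
        if choose_faster(estimates) != "A":  # A is the truly faster design
            errors += 1
    return errors / trials

# Hypothetical scenario: A is 20% faster, individual estimates are noisy.
single = error_rate(true_advantage=0.20, noise_sd=0.30, k=1)
triple = error_rate(true_advantage=0.20, noise_sd=0.30, k=3)
```

Averaging three estimates shrinks the noise of the combined estimate by a factor of √3, so the simulated error rate with `k=3` comes out well below the single-estimate rate, which is the intuition behind basing the decision on several independent estimates.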