Heuristic evaluation is a common technique for assessing usability, but it is most often conducted by a team of three to five evaluators. Our project involved a team of 16 stakeholders assessing the usability of a mission-critical decision support system for the US military. Data collected from so many evaluators could easily become overwhelming, so we devised a method that first filters evaluations by the level of agreement between evaluators and then prioritizes the remaining findings by their individual Frequency, Impact, and Severity scores. We call our methodology the 'Integrated Stakeholder Usability Evaluation Process' and believe it will be useful to other researchers conducting heuristic evaluations with large groups.
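The abstract describes a two-stage pipeline: filter findings by inter-evaluator agreement, then rank the survivors by Frequency, Impact, and Severity. The sketch below shows how such a pipeline might be structured; the `Finding` class, the 0.25 agreement threshold, the 1-5 rating scales, and the multiplicative F x I x S score are all illustrative assumptions, since the abstract does not give the actual formulas.

```python
from dataclasses import dataclass

# Hypothetical illustration of the two-stage filter-then-prioritize idea.
# The threshold and the multiplicative score below are assumptions,
# not the paper's published method.

@dataclass
class Finding:
    description: str
    reported_by: set[str]  # evaluators who independently reported this problem
    frequency: int         # assumed 1-5 scale: how often users hit the problem
    impact: int            # assumed 1-5 scale: how hard it is to overcome
    severity: int          # assumed 1-5 scale: overall severity rating

def agreement(finding: Finding, num_evaluators: int) -> float:
    """Fraction of the evaluation team that reported this finding."""
    return len(finding.reported_by) / num_evaluators

def prioritize(findings: list[Finding], num_evaluators: int,
               min_agreement: float = 0.25) -> list[Finding]:
    """Stage 1: keep findings that enough evaluators agree on.
    Stage 2: rank the survivors by a combined F x I x S score."""
    survivors = [f for f in findings
                 if agreement(f, num_evaluators) >= min_agreement]
    return sorted(survivors,
                  key=lambda f: f.frequency * f.impact * f.severity,
                  reverse=True)

if __name__ == "__main__":
    team_size = 16
    findings = [
        Finding("Ambiguous alert icons", {"e1", "e2", "e3", "e4", "e5"}, 4, 3, 4),
        Finding("Slow map redraw", {"e2", "e7"}, 3, 4, 3),
        Finding("Inconsistent button labels", {"e1", "e3", "e6", "e8", "e9"}, 2, 2, 2),
    ]
    for f in prioritize(findings, team_size):
        print(f"{f.description}: score={f.frequency * f.impact * f.severity}")
```

With 16 evaluators and the assumed 0.25 threshold, "Slow map redraw" (agreement 2/16) is filtered out in stage 1, and the remaining findings are ranked by their combined scores. Any monotonic combination of the three ratings would work in place of the product shown here.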