Although much can be gained by analyzing usability problems, there is no overall framework in which large sets of usability problems can be easily classified, compared, and analyzed. Current approaches to problem analysis that focus on identifying specific problem characteristics (such as severity or cost-to-fix) do provide additional information to the developer; however, they do not adequately support high-level (global) analysis. High-level approaches to problem analysis depend on the developer/evaluator's ability to group problems, yet commonly used techniques for organizing usability problems are incomplete and/or provide inadequate information for problem correction. This paper presents the Usability Problem Taxonomy (UPT), a taxonomic model in which usability problems detected in graphical user interfaces with textual components are classified from both an artifact and a task perspective. The UPT was built empirically using over 400 usability problem descriptions collected on real-world development projects. The UPT has two components and contains 28 categories: 19 are in the artifact component and nine are in the task component. A study was conducted showing that problems can be classified reliably using the UPT. Techniques for high-level problem analysis are explored using UPT classification of a set of usability problems detected during an evaluation of a CASE tool. In addition, ways to augment or complement existing problem analysis strategies using UPT analysis are suggested. A summary of reports from two developers who have used the UPT in the workplace provides anecdotal evidence indicating that UPT classification has improved problem identification, reporting, analysis, and prioritization prior to correction.
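The two-axis scheme described above (each problem labeled with one artifact-component category and one task-component category, then tallied for global analysis) can be sketched in a few lines of Python. This is a minimal illustration only: the category names below are hypothetical placeholders, not the actual 19 artifact and 9 task categories defined by the UPT.

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical placeholder categories for illustration; the real UPT defines
# 19 artifact-component and 9 task-component categories.
ARTIFACT_CATEGORIES = {"visual-layout", "wording", "feedback"}
TASK_CATEGORIES = {"task-mapping", "navigation", "automation"}

@dataclass
class UsabilityProblem:
    description: str
    artifact: str   # one category from the artifact component
    task: str       # one category from the task component

    def __post_init__(self):
        # Classification is valid only if both labels come from the taxonomy.
        if self.artifact not in ARTIFACT_CATEGORIES:
            raise ValueError(f"unknown artifact category: {self.artifact}")
        if self.task not in TASK_CATEGORIES:
            raise ValueError(f"unknown task category: {self.task}")

def category_counts(problems):
    """High-level (global) analysis: tally problems per category on each axis."""
    artifact = Counter(p.artifact for p in problems)
    task = Counter(p.task for p in problems)
    return artifact, task

problems = [
    UsabilityProblem("Error message uses internal jargon", "wording", "task-mapping"),
    UsabilityProblem("No confirmation after save", "feedback", "task-mapping"),
    UsabilityProblem("Toolbar icons are crowded", "visual-layout", "navigation"),
]
art, tsk = category_counts(problems)
print(art)  # per-category totals on the artifact axis
print(tsk)  # per-category totals on the task axis
```

Grouping classified problems this way is what enables the kind of global analysis the paper argues for, e.g. spotting that most reported problems cluster in one task category and prioritizing correction effort accordingly.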