If you ask someone outside the Human-Computer Interaction (HCI) field about usability, many will mention the "classic" discount methods popularized by Jakob Nielsen and others. Discount methods appeal because they seem easy to do and, more importantly for business, inexpensive. This makes them especially attractive to smaller startup companies with tight budgets. But are discount methods too risky to be worth even their "low" cost? This month's business column authors think so, based on their research and experience. Indeed, they believe these discount methods may actually backfire and end up discrediting the field. Following a lively discussion on the CHI-WEB listserv, we asked them to explain what they see as the risks, and what they believe we, as a profession, can and should do about them.

--- David Siegel and Susan Dray