A variety of techniques exists for the automated evaluation of interactive software systems. Their differing backgrounds and their various concepts for representation and processing make it difficult for developers (and users) to identify the appropriate technique for automated evaluation with respect to acknowledged usability principles, such as suitability for the task. To facilitate the selection and application of automated usability-evaluation techniques, we introduce a template for structured documentation and reflection. Enriching traditional schemes, it addresses the relationship between usability principles and the parameters used for processing. We consider this relationship to be of major importance, since it not only facilitates communication between users and designers but also reveals how qualitative attributes can be mapped onto operational structures. If that information could be utilized for design, e.g., for automatically checking specifications or prototypes, interactive-system development could be improved significantly. The proposed template stems from our work in the EU COST Action 294 MAUSE (www.cost294.org), which targets the quality assessment of usability-evaluation methods.
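To illustrate the idea of documenting the link between usability principles and processing parameters, the following is a minimal sketch of what one template entry might look like as a data structure. All field and function names here are illustrative assumptions, not the actual schema of the proposed MAUSE template.

```python
from dataclasses import dataclass

# Hypothetical sketch of one documentation-template entry.
# Field names are assumptions for illustration only.
@dataclass
class EvaluationTechniqueEntry:
    name: str                       # name of the evaluation technique
    usability_principles: list      # principles the technique claims to address
    processing_parameters: dict     # operational parameters -> units/types
    principle_to_parameters: dict   # explicit mapping: principle -> parameter names

    def unmapped_principles(self):
        """Principles claimed but not linked to any processing parameter —
        exactly the gap the template is meant to make visible."""
        return [p for p in self.usability_principles
                if not self.principle_to_parameters.get(p)]

entry = EvaluationTechniqueEntry(
    name="log-based task analysis",
    usability_principles=["suitability for the task", "self-descriptiveness"],
    processing_parameters={"task_completion_time": "seconds", "error_rate": "ratio"},
    principle_to_parameters={
        "suitability for the task": ["task_completion_time", "error_rate"],
    },
)
print(entry.unmapped_principles())  # -> ['self-descriptiveness']
```

Making the principle-to-parameter mapping explicit in this way is what would allow a tool to flag, for a given technique, which qualitative usability principles have no operational counterpart — the kind of automatic check over specifications that the abstract envisions.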