Tabletop groupware systems have natural advantages for collaboration, but they present a challenge for application designers because shared work and interaction progress differently than in desktop systems. As a result, tabletop systems still suffer from usability problems. We have developed a usability evaluation technique, T-CUA, that focuses attention on teamwork issues and can help designers determine whether prototypes adequately support the basic actions and interactions that are fundamental to table-based collaboration. We compared T-CUA with expert review in a user study in which 12 evaluators assessed an early tabletop prototype using one of the two evaluation methods. Evaluators using T-CUA found more teamwork problems, and found problems in more areas, than those using expert review; in addition, participants rated T-CUA as effective and easy to use. The success of T-CUA demonstrates the benefits of using a set of activity primitives as the basis for discount usability techniques.