Investigating the fidelity effect when evaluating game prototypes with children
BCS-HCI '13 Proceedings of the 27th International BCS Human Computer Interaction Conference
A number of studies have compared evaluation results from prototypes of different fidelities, but very few have involved children. This paper reports a comparative study of three prototypes, ranging from low to high fidelity, in the context of mobile games, using a between-subjects design with 37 participants aged 7 to 9. The children played a matching game on an iPad, on a paper prototype built from screenshots of the actual game, or on a sketched version. Observational data were captured to establish usability problems, and two tools from the Fun Toolkit were used to measure user experience. The results showed little difference in user experience between the three prototypes, and very few usability problems were unique to a specific prototype. The contribution of this paper is evidence that low-fidelity prototypes can be used effectively with children to evaluate games of this genre and style.