EVA: an experimental video annotator for symbolic analysis of video data. ACM SIGCHI Bulletin.
Computer support for transcribing recorded activity. ACM SIGCHI Bulletin.
Using video in the BNR usability lab. ACM SIGCHI Bulletin.
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
EuroPARC's integrated interactive intermedia facility (IIIF): early experiences. Proceedings of the IFIP WG 8.4 conference on Multi-user interfaces and applications.
Integrated data capture and analysis tools for research and testing on graphical user interfaces. CHI '92 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Towards visual analysis of usability test logs using task models. TAMODIA'06 Proceedings of the 5th international conference on Task models and diagrams for users interface design.
This paper presents a simple but powerful technique for supporting user interface evaluation, along with a prototype implementation. The technique combines event streams with video recordings: the evaluator analyzes the event stream to find patterns of interesting or important user actions, then uses the timestamps recorded with those actions to play back only the relevant sections of the video. This allows, for example, every place where the user invokes the help system or a particular command to be reviewed without manually searching the recording or sitting through long stretches of unrelated interaction. By pairing the precise record of an automatic event trace with the rich contextual information captured in video and audio, the technique supports analyses that would not be practical with either medium alone.
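The core of the technique described above can be sketched in a few lines: scan a timestamped event log for a pattern of interest, expand each hit into a short video interval, and merge overlapping intervals so the evaluator watches each moment only once. This is a minimal illustration, not the authors' implementation; the `Event` class, the action names, and the lead/tail padding values are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float  # seconds from the start of the session recording
    action: str       # hypothetical action label, e.g. "help.invoke"

def segments_of_interest(events, pattern, lead=5.0, tail=10.0):
    """Return (start, end) video intervals around each event whose
    action matches `pattern`, padded by `lead` seconds of context
    before and `tail` seconds after."""
    spans = [(max(0.0, e.timestamp - lead), e.timestamp + tail)
             for e in events if pattern in e.action]
    # Merge overlapping intervals so repeated nearby actions
    # produce a single continuous segment to review.
    merged = []
    for start, end in sorted(spans):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged
```

The resulting intervals would then drive the video player's seek mechanism, so two help invocations at t=12s and t=14s, say, collapse into one segment rather than two overlapping ones.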