Through the automation of empirical evaluation, we hope to alleviate evaluation problems encountered by software designers who are relatively new to the process. Barriers to good empirical evaluation include the tedium of setting up a new test for each project and the time and expertise needed to construct a quality test. By reducing the time and effort required for evaluation through a wizard-like system that does not demand expertise in evaluation techniques, we hope to make the evaluation process accessible to a wider variety of software designers. The system is implemented around a library of design knowledge, in the form of claims, that is used to focus the evaluations. User tests were performed to gauge receptiveness to the software tool as well as the performance of the underlying methods. Results were positive, justifying further research in this area while also exposing problem areas for improvement.
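To make the claims-driven idea concrete, the sketch below shows one way a library of claims (each recording a design feature's upsides and downsides) could be turned into focused evaluation questions. The class names, fields, and example claim are illustrative assumptions, not the authors' actual implementation.

```python
from dataclasses import dataclass

# Hypothetical claim record: a design feature plus its
# claimed positive and negative usability consequences.
@dataclass
class Claim:
    feature: str
    upsides: list
    downsides: list

def evaluation_questions(claims):
    """Turn each claim's tradeoffs into questions a usability
    test of a notification interface might try to answer."""
    questions = []
    for c in claims:
        for up in c.upsides:
            questions.append(f"Does '{c.feature}' achieve: {up}?")
        for down in c.downsides:
            questions.append(f"Does '{c.feature}' avoid: {down}?")
    return questions

# Illustrative library entry (invented for this sketch).
library = [
    Claim("audio alert",
          upsides=["notifies the user of new messages promptly"],
          downsides=["may interrupt the user's primary task"]),
]

for q in evaluation_questions(library):
    print(q)
```

A wizard-style front end could then walk a novice designer through answering each generated question, so that the test plan follows from reused design knowledge rather than from evaluation expertise.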