We describe an expert evaluation for user-requirement elicitation of an annotation system, the Digital Library Annotation Service (DiLAS), which facilitates collaborative information access and sharing. The analytical evaluation was conducted as a Participatory Group Evaluation, which involved presenting, beyond the written papers, the objectives and rationale behind the development of the prototype. The empirical evaluation of DiLAS consisted of two experiments. The first was a bottom-up evaluation of the usability of the interface using a qualitative approach. The second stage moved towards a broader work context with a User and Work Centred Evaluation involving an entire collaborative task situation, which required knowledge sharing on a common real-life work task. This paper describes the first stage in an iterative evaluation process; its preliminary result is a set of requirements that will inform the next stage of DiLAS development.