In this paper we present the results of a heuristic usability evaluation of three annotation tools (GATE, MMAX2 and UAM CorpusTool). We describe typical usability problems in two categories: (1) general problems, which arise from disregarding established best practices and guidelines for user interface (UI) design, and (2) more specific problems, which are closely tied to the domain of linguistic annotation. By discussing the domain-specific problems we hope to raise tool developers' awareness of potential problem areas. A set of 28 design recommendations, describing generic solutions to the identified problems, points toward a structured and systematic collection of usability patterns for linguistic annotation tools.