The purpose of this work is to identify potential evaluation criteria for interactive, analytical question-answering (QA) systems by analyzing evaluative comments made by users of such a system. Qualitative data collected from intelligence analysts during interviews and focus groups were analyzed to identify common themes related to performance, use, and usability. These data were gathered as part of an intensive, three-day evaluation workshop on the High-Quality Interactive Question Answering (HITIQA) system. Inductive coding and memoing were used to identify and categorize the themes. The results suggest potential evaluation criteria for interactive, analytical QA systems that can guide the development and design of future systems and evaluations. This work contributes to research on QA systems, information-seeking and use behaviors, and interactive searching. © 2007 Wiley Periodicals, Inc.