Online question-answering (Q&A) services are becoming increasingly popular among information seekers. We divide them into two categories, social Q&A (SQA) and virtual reference (VR), and examine how experts (librarians) and end users (students) evaluate information within each. To accomplish this, we first performed an extensive literature review and compiled a list of the aspects found to contribute to a “good” answer, grouping them under three high-level concepts: relevance, quality, and satisfaction. We then interviewed both experts and users, asking them first to reflect on their online Q&A experiences and then to comment on our list of aspects. These interviews uncovered two main disparities: one between users’ expectations of these services and how information was actually delivered by them, and another between users’ and experts’ perceptions of the three aforementioned characteristics of relevance, quality, and satisfaction. Using qualitative analyses of both the interviews and the relevant literature, we suggest ways to create better hybrid solutions for online Q&A and to bridge the gap between experts’ and users’ understandings of relevance, quality, and satisfaction, as well as the perceived importance of each in contributing to a good answer. © 2012 Wiley Periodicals, Inc.