The practical goal of information retrieval (IR) research is to create ways of supporting humans in accessing information so that they can better carry out their tasks. IR research therefore has a primarily technological interest in knowledge creation: how can people interact with information better? It thus has a constructive aspect (creating novel systems) and an evaluative aspect (determining whether they are any good). Evaluation of IR effectiveness is guided by theory about the factors that affect effectiveness. Science proper is about theory development, i.e., understanding and explaining, making hypotheses and testing them. Theories express structured explanatory relationships between variables such as "type of document indexing" and "quality of ranking measured by MAP". A theory is the better, the wider the range of phenomena it is able to cover accurately. This paper argues that most existing theories of IR have a narrow scope: they are theories of ranking. For IR to fulfill its task of supporting human information access, theories that go beyond the evaluation of ranking are highly desirable but face many challenges. We discuss three additional types of IR theories: theories of searching, theories of information access, and theories of information interaction.
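To make the example variable "quality of ranking measured by MAP" concrete, the following is a minimal sketch of mean average precision (MAP), the standard ranking-quality measure the abstract refers to. The function names and the toy documents are illustrative choices, not part of the paper.

```python
def average_precision(ranked, relevant):
    """Average precision for one query: mean of precision@k taken
    at each rank k where a relevant document appears, normalized by
    the total number of relevant documents."""
    hits = 0
    precisions = []
    for k, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / len(relevant) if relevant else 0.0


def mean_average_precision(runs):
    """MAP: the mean of per-query average precision over all queries.
    `runs` is a list of (ranked_list, relevant_set) pairs."""
    return sum(average_precision(r, rel) for r, rel in runs) / len(runs)


# Toy example: two queries with hypothetical document IDs.
q1 = (["d1", "d2", "d3", "d4"], {"d1", "d3"})   # AP = (1/1 + 2/3) / 2
q2 = (["d5", "d6", "d7"], {"d6"})                # AP = (1/2) / 1
print(mean_average_precision([q1, q2]))
```

Theories of ranking, in the abstract's sense, relate system-side variables (such as indexing choices) to precisely this kind of test-collection score; the later sections argue that such theories leave the searcher's behavior and task outcomes out of scope.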