This study introduces a novel framework for evaluating passage and XML retrieval. The framework focuses on the effort a user must expend to localize relevant content in a result document, measured under a system-guided reading order as the quantity of text the user is expected to browse through. More specifically, the study develops evaluation metrics for retrieval methods following a fetch-and-browse approach: in the fetch phase, documents are ranked in decreasing order of their document score, as in traditional document retrieval; in the browse phase, a set of non-overlapping passages representing the relevant text within each retrieved document is returned. In other words, the passages of the document are reorganized so that the best-matching passages are read first, in sequential order. We present an application scenario motivating the framework and propose sample metrics based on it. These metrics provide a basis for comparing the effectiveness of traditional document retrieval with passage/XML retrieval, and they illuminate the benefit of the latter.
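The effort measure sketched above can be illustrated in code. The following is a minimal sketch under assumed data structures (a result document as a list of passages in reading order, each with a character count and a binary relevance flag); the `Passage` type, the `browsing_effort` function, and the stop condition (reading until the last relevant passage has been seen) are hypothetical simplifications, not the paper's actual metrics.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Passage:
    chars: int      # passage length in characters
    relevant: bool  # does the passage contain relevant text?

def browsing_effort(passages: List[Passage]) -> int:
    """Characters browsed, in the given reading order, until all
    relevant text in the document has been seen (illustrative only)."""
    if not any(p.relevant for p in passages):
        # No relevant text: the user scans the whole document in vain.
        return sum(p.chars for p in passages)
    last_rel = max(i for i, p in enumerate(passages) if p.relevant)
    return sum(p.chars for p in passages[: last_rel + 1])

# A document whose only relevant passage sits near the end.
doc = [Passage(500, False), Passage(400, False),
       Passage(300, True), Passage(200, False)]

# Document retrieval: original sequential reading order.
sequential = browsing_effort(doc)  # 500 + 400 + 300 = 1200

# Fetch-and-browse: best-matching passages moved to the front
# (stable sort keeps the original order within each group).
reordered = sorted(doc, key=lambda p: not p.relevant)
focused = browsing_effort(reordered)  # 300
```

Comparing `sequential` with `focused` for the same ranked list is the kind of contrast the proposed metrics are meant to capture: the smaller the browsing effort under the system-guided order, the greater the benefit of focused retrieval over plain document retrieval.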