The Relevant in Context retrieval task is document or article retrieval with a twist: not only should the relevant articles be retrieved, but the relevant information within each article (captured by a set of XML elements) should also be correctly identified. Our main research question is: how should the Relevant in Context task be evaluated? We propose a generalized average precision measure that meets two main requirements: i) the score reflects the ranked list of articles inherent in the result list, and ii) the score also reflects how well the retrieved information per article (i.e., the set of elements) corresponds to the relevant information. The resulting measure was used at INEX 2006.
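The measure can be sketched as follows: each retrieved article receives a per-article score for how well its retrieved elements match the relevant elements, and these scores replace the binary relevance judgments in an average-precision-style computation over the article ranking. The sketch below is illustrative only — it uses a set-based F-score as the per-article score and all names (`per_article_score`, `generalized_ap`) are hypothetical; the official INEX measure defines the per-article score differently (e.g., over retrieved text rather than element identifiers).

```python
def per_article_score(retrieved, relevant):
    """F-score between retrieved and relevant element sets.

    A simplifying assumption: the INEX measure scores retrieved
    text against relevant text, not element-ID overlap.
    """
    overlap = len(set(retrieved) & set(relevant))
    if overlap == 0:
        return 0.0
    p = overlap / len(retrieved)   # precision over retrieved elements
    r = overlap / len(relevant)    # recall over relevant elements
    return 2 * p * r / (p + r)

def generalized_ap(ranking, qrels):
    """Generalized average precision over a ranked article list.

    ranking: list of (article_id, set_of_retrieved_elements), in rank order
    qrels:   dict mapping article_id -> set of relevant elements
    """
    total_relevant = sum(1 for elems in qrels.values() if elems)
    if total_relevant == 0:
        return 0.0
    gap = 0.0
    cumulative = 0.0
    for rank, (article, elems) in enumerate(ranking, start=1):
        cumulative += per_article_score(elems, qrels.get(article, set()))
        if qrels.get(article):
            # generalized precision at this rank, accumulated at
            # each rank where a relevant article appears
            gap += cumulative / rank
    return gap / total_relevant
```

For example, retrieving the single relevant article first with exactly its relevant elements yields a score of 1.0, while pushing it to rank 2 behind a non-relevant article halves the score — so the measure rewards both article ranking and within-article accuracy.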