Overview of the INEX 2009 ad hoc track
INEX'09: Focused Retrieval and Evaluation, Proceedings of the 8th International Workshop of the Initiative for the Evaluation of XML Retrieval
This paper analyzes the results of the INEX 2009 Ad Hoc Track. First, we examine the relevance judgments in detail. Second, we study the resulting system rankings for each of the four ad hoc tasks and determine whether the differences between the best-scoring participants are statistically significant. Third, we restrict our attention to particular run types: element and passage runs, keyword and phrase query runs, and systems using a reference run with a solid article ranking. Fourth, we examine the relative effectiveness of content-only (CO, or keyword) search and content-and-structure (CAS, or structured) search. Fifth, we look at the ability of focused retrieval techniques to rank articles. Sixth, we study the length of retrieved results and the impact of restricting result length.
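The significance testing mentioned above is commonly done with a paired test over per-topic effectiveness scores of two systems. The abstract does not specify the test used, so the following is only a minimal sketch of one standard choice, a paired t statistic; the helper name, the metric, and the toy per-topic scores are all assumptions for illustration.

```python
import math
from statistics import mean, stdev

def paired_t(scores_a, scores_b):
    """Paired t statistic over per-topic scores of two systems.

    Hypothetical helper: scores_a[i] and scores_b[i] are the two
    systems' effectiveness scores on the same topic i.
    """
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    if sd == 0:
        # identical per-topic differences: degenerate case
        return float("inf") if mean(diffs) != 0 else 0.0
    return mean(diffs) / (sd / math.sqrt(n))

# Toy per-topic scores for two hypothetical runs (not real INEX data).
sys_a = [0.62, 0.58, 0.71, 0.49, 0.66]
sys_b = [0.55, 0.60, 0.64, 0.45, 0.61]
t = paired_t(sys_a, sys_b)
print(round(t, 3))
```

The resulting t value would then be compared against the Student-t distribution with n-1 degrees of freedom to decide whether the ranking difference is significant at a chosen level.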