Sound and complete relevance assessment for XML retrieval
ACM Transactions on Information Systems (TOIS)
Comparing retrieval approaches requires test collections, which consist of documents, queries, and relevance assessments. Obtaining consistent and exhaustive relevance assessments is crucial for a fair comparison of retrieval approaches. Whereas the evaluation methodology for flat-text retrieval is well established, the evaluation of XML retrieval remains an open research issue. This is because XML documents are composed of nested elements whose relevance cannot be judged independently: if an element is relevant, every element that contains it necessarily contains relevant content as well. This paper describes the methodology adopted in INEX (the INitiative for the Evaluation of XML Retrieval) to ensure consistent and exhaustive relevance assessments.
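To make the dependency concrete, here is a minimal sketch, not INEX's actual assessment tooling, of how relevance judgments on nested elements constrain one another. The XPath-like element paths, the function names, and the repair strategy (propagating relevance upward to ancestors) are all invented for illustration under that assumption.

```python
# Hypothetical sketch: nested XML elements cannot be judged independently.
# If an element is assessed as relevant, every ancestor element contains
# that relevant content, so a judgment of "irrelevant" on an ancestor is
# inconsistent. Paths and names below are illustrative, not INEX's tools.

def ancestors(path: str):
    """Yield the ancestor paths of an XPath-like element path."""
    parts = path.strip("/").split("/")
    for i in range(1, len(parts)):
        yield "/" + "/".join(parts[:i])

def enforce_consistency(assessments: dict[str, bool]) -> dict[str, bool]:
    """Propagate relevance upward: mark every ancestor of a relevant
    element as relevant, repairing contradictory judgments."""
    fixed = dict(assessments)
    for path, relevant in assessments.items():
        if relevant:
            for anc in ancestors(path):
                fixed[anc] = True  # the ancestor contains relevant content
    return fixed

# An assessor judged a paragraph relevant but its enclosing section
# irrelevant -- an inconsistency that the upward propagation repairs.
judged = {
    "/article/sec[1]": False,
    "/article/sec[1]/p[2]": True,
}
print(enforce_consistency(judged))
# {'/article/sec[1]': True, '/article/sec[1]/p[2]': True, '/article': True}
```

This only captures the structural half of the problem (consistency); exhaustiveness, i.e. ensuring assessors actually judge all potentially relevant elements, is the procedural question the paper's methodology addresses.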