Report on INEX 2010

  • Authors:
  • D. Alexander; P. Arvola; T. Beckers; P. Bellot; T. Chappell; C. M. DeVries; A. Doucet; N. Fuhr; S. Geva; J. Kamps; G. Kazai; M. Koolen; S. Kutty; M. Landoni; V. Moriceau; R. Nayak; R. Nordlie; N. Pharo; E. SanJuan; R. Schenkel; A. Tagarelli; X. Tannier; J. A. Thom; A. Trotman; J. Vainio; Q. Wang; C. Wu

  • Venue:
  • ACM SIGIR Forum
  • Year:
  • 2011

Abstract

INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2010 evaluation campaign, which consisted of a wide range of tracks: Ad Hoc, Book, Data Centric, Interactive, QA, Link the Wiki, Relevance Feedback, Web Service Discovery and XML Mining.