Systematic evaluation has long had an important place in information retrieval research, from the early Cranfield experiments to current campaigns such as TREC (Text REtrieval Conference). Such benchmarks are often credited with advancing a research field and making techniques comparable, yet their exact impact is hard to measure. This paper assesses the scholarly impact of the ImageCLEF image retrieval evaluation initiative. To this end, the papers in the proceedings published after each evaluation campaign, together with the citations those papers received, are analysed using Scopus and Google Scholar. This bibliometric analysis demonstrates that ImageCLEF has had a significant scholarly impact. The differences between the two analysis methods, each with its own advantages and limitations, are also discussed.
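The kind of bibliometric comparison described above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example, not the authors' actual method: the citation counts are invented for illustration, and it simply computes common indicators (total citations, mean citations per paper, and the h-index) for the same set of papers as reported by two different sources.

```python
from statistics import mean

def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical per-paper citation counts for one campaign's proceedings,
# as reported by two sources (all values are invented for illustration).
citations_by_source = {
    "Scopus":         [34, 21, 15, 9, 7, 4, 2, 0],
    "Google Scholar": [52, 30, 24, 14, 11, 6, 3, 1],
}

for source, counts in citations_by_source.items():
    print(f"{source}: total={sum(counts)}, "
          f"mean={mean(counts):.1f}, h-index={h_index(counts)}")
```

In practice the two sources typically disagree, as in this toy data: Google Scholar tends to report higher counts because it indexes a broader set of documents, which is one reason the paper discusses the advantages and limitations of each method.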