Searching the wikipedia with public online search engines
INEX'10 Proceedings of the 9th international conference on Initiative for the evaluation of XML retrieval: comparative evaluation of focused retrieval
Wikipedia articles are usually accompanied by history pages, categories, and talk pages. The metadata available in these pages can be analyzed to gain a better understanding of the content and quality of the articles. We analyze the quality of the search results that the current major Web search engines (Google, Yahoo!, and Live) return for Wikipedia. We discuss how the rich metadata available in wiki pages can be used to provide better search results in Wikipedia. We investigate the effect of incorporating the extent of review of an article into the ranking of search results. The extent of review is measured by the number of distinct editors who have contributed to an article and is extracted by processing Wikipedia's history pages. Our experimental results show that re-ranking the search results of the three major Web search engines using the review feature improves the quality of their rankings for Wikipedia-specific searches.
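The re-ranking idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: the linear combination, the `alpha` weight, and the log-damping of editor counts are all assumptions introduced here, and the function and variable names are hypothetical.

```python
import math

def rerank(results, editor_counts, alpha=0.5):
    """Re-rank search results by blending the engine's score with the
    article's extent of review (number of distinct editors).

    results: list of (url, engine_score) pairs from a Web search engine.
    editor_counts: dict mapping url -> distinct editors, as would be
        extracted from Wikipedia's history pages.
    alpha: assumed mixing weight between engine score and review signal.
    """
    def combined(url, score):
        # log1p damps very large editor counts; articles never seen in
        # the history data get a review signal of 0.
        review = math.log1p(editor_counts.get(url, 0))
        return (1 - alpha) * score + alpha * review

    return sorted(results, key=lambda r: combined(r[0], r[1]), reverse=True)

# Toy example: "B" has a lower engine score than "A" but far more
# distinct editors, so it is promoted by the review feature.
results = [("A", 1.0), ("B", 0.8), ("C", 0.6)]
editors = {"A": 2, "B": 150, "C": 40}
print(rerank(results, editors))
```

With these toy numbers the heavily reviewed article "B" moves to the top, illustrating how the review feature can override small differences in the engines' own scores.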