The One Click Access Task (1CLICK) of NTCIR requires systems to return a concise multi-document summary of web pages in response to a query which is assumed to have been submitted in a mobile context. Systems are evaluated based on information units (or iUnits), and are required to present important pieces of information first and to minimise the amount of text the user has to read. Using the official Japanese results of the second round of the 1CLICK task from NTCIR-10, we discuss our task setting and evaluation framework. Our analyses show that: (1) Simple baseline methods that leverage search engine snippets or Wikipedia are effective for 'lookup' type queries but not necessarily for other query types; (2) There is still a substantial gap between manual and automatic runs; and (3) Our evaluation metrics are relatively robust to the incompleteness of iUnits.
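To make the evaluation idea concrete, below is a minimal sketch of a position-discounted, weight-based recall metric in the spirit of the S-measure used for 1CLICK evaluation. The exact official formula, parameter values, and offset conventions are defined in the task overview; the function name, the `L = 500` default, and the tuple layouts here are illustrative assumptions only.

```python
def s_measure(matched, ideal, L=500):
    """Sketch of an S-measure-style score (not the official implementation).

    matched: list of (weight, offset) pairs for iUnits found in the system
             output, where offset is the character position at which the
             user has read enough text to acquire the iUnit.
    ideal:   list of (weight, vital_string_length) pairs for all gold iUnits.
    L:       patience parameter, i.e. roughly how many characters the user
             is willing to read (assumed value; the task defines its own).
    """
    # Numerator: each matched iUnit earns its weight, discounted linearly
    # by how far into the text the user must read to reach it. This rewards
    # presenting important pieces of information first.
    num = sum(w * max(0, L - off) for w, off in matched)

    # Denominator: the same discounted sum for a "pseudo-minimal" output
    # that presents all gold iUnits in decreasing weight order, packed
    # back to back with no wasted text.
    denom, pos = 0.0, 0
    for w, length in sorted(ideal, key=lambda x: -x[0]):
        pos += length
        denom += w * max(0, L - pos)

    return num / denom if denom > 0 else 0.0
```

Under this sketch, a system that covers every iUnit as compactly as the pseudo-minimal ordering scores 1.0, while burying important iUnits deep in the text (or padding the summary with irrelevant characters) pushes the score toward 0, which is why the metric penalises both missing information and excess reading effort.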