An online framework for supporting the evaluation of personalised information retrieval systems

  • Authors:
  • Catherine Mulwa;Luca Longo;Séamus Lawless;Mary Sharp;Vincent Wade

  • Affiliations:
  • Catherine Mulwa, Séamus Lawless, Mary Sharp, Vincent Wade: Knowledge and Data Engineering Group, School of Computer Science and Statistics, Trinity College Dublin
  • Luca Longo: Distributed Systems Group, School of Computer Science and Statistics, Trinity College Dublin

  • Venue:
  • iUBICOM'11 Proceedings of the 6th international conference on Ubiquitous and Collaborative Computing
  • Year:
  • 2011


Abstract

Scope - Personalised Information Retrieval (PIR) has been gaining attention because it investigates intelligent ways of enhancing content delivery, so that web users can receive personalised services and more accurate information. Problem - Several PIR systems have been proposed in the literature; however, they have not been properly tested or evaluated. Proposal - The authors propose a generally applicable web-based interface which provides PIR developers and evaluators with: i) implicit recommendations on how to evaluate a specific PIR system; ii) a repository of user-centred and layered evaluation studies; iii) recommendations on how best to combine different evaluation methods, metrics and measurement criteria in order to evaluate their system most effectively; iv) a user-centred evaluation (UCE) methodology which details how to apply existing UCE techniques; v) a taxonomy of evaluations of adaptive systems; and vi) interface translation support (49 languages supported).