We take a fresh look at score normalization for merging result lists, isolating the problem from other components. We focus on three of the simplest, most practical, and most widely used linear methods that require no training data: MinMax, Sum, and Z-Score. We provide theoretical arguments on why and when these methods work, and evaluate them experimentally. We find that MinMax is the most robust under many circumstances, and that Sum, in contrast to previous literature, is the worst. Based on the insights gained, we propose three further simple methods that perform as well as or better than the baselines.
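As a minimal sketch of the three baseline methods, assuming the standard linear definitions from the metasearch literature (MinMax shifts the minimum to 0 and scales the maximum to 1; Sum shifts the minimum to 0 and scales scores to sum to 1; Z-Score standardizes to zero mean and unit variance) — the function names and degenerate-case handling are our own:

```python
import statistics

def minmax_norm(scores):
    """MinMax: map the minimum score to 0 and the maximum to 1."""
    lo, hi = min(scores), max(scores)
    if hi == lo:  # all scores equal; normalization is undefined
        return [0.0] * len(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def sum_norm(scores):
    """Sum: shift the minimum to 0, then scale so scores sum to 1."""
    lo = min(scores)
    shifted = [s - lo for s in scores]
    total = sum(shifted)
    if total == 0:  # all scores equal
        return [0.0] * len(scores)
    return [s / total for s in shifted]

def zscore_norm(scores):
    """Z-Score: standardize to zero mean and unit (population) variance."""
    mu = statistics.mean(scores)
    sigma = statistics.pstdev(scores)
    if sigma == 0:  # all scores equal
        return [0.0] * len(scores)
    return [(s - mu) / sigma for s in scores]
```

After normalization, the per-list scores are comparable and the result lists can be merged, e.g. by sorting all documents by their normalized scores.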