This paper takes a fresh look at the modeling approaches to information retrieval that have underpinned much of the probabilistically motivated IR research of the last 20 years. We adopt a subjectivist Bayesian view of probabilities and argue that the classical work on probabilistic retrieval is best understood from this perspective. The main focus of the paper is the ranking formulas corresponding to the Binary Independence Model (BIM), presented originally by Robertson and Sparck Jones [1977], and the Combination Match Model (CMM), developed shortly thereafter by Croft and Harper [1979]. We show how these same ranking formulas can be derived from a probabilistic methodology commonly known as Maximum Entropy (MAXENT).
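To make the two ranking formulas concrete, here is a minimal sketch of the term weights the abstract refers to: the Robertson/Sparck Jones relevance weight of the BIM (with the customary 0.5 smoothing) and the Croft/Harper weight used when no relevance information is available. Function names and the tuning constant `C` are illustrative, not taken from the paper.

```python
import math

def rsj_weight(N, n, R=0, r=0):
    """Robertson/Sparck Jones (1977) relevance weight for a term.

    N: total documents in the collection
    n: documents containing the term
    R: known relevant documents
    r: relevant documents containing the term
    The 0.5 corrections avoid zero or undefined log arguments.
    """
    return math.log(((r + 0.5) * (N - n - R + r + 0.5)) /
                    ((n - r + 0.5) * (R - r + 0.5)))

def croft_harper_weight(N, n, C=1.0):
    """Croft/Harper (1979) combination-match weight, applicable
    when no relevance judgments are available; C is a constant
    rewarding any query-term match."""
    return C + math.log((N - n) / n)
```

With no relevance information (R = r = 0), the smoothed RSJ weight reduces to approximately log((N - n) / n), i.e. the Croft/Harper weight with C = 0, which is the connection between the two models that the paper's MAXENT derivation makes precise.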