A grouped ranking model for item preference parameter
Neural Computation
The Bradley-Terry model is a statistical model for preference or ranking data based on pairwise comparisons of items. Several estimation methods for the model, based on sums of weighted Kullback-Leibler divergences, have been proposed in various contexts. The purpose of this letter is to interpret the estimation mechanism of the Bradley-Terry model from the viewpoint of flatness, a fundamental notion in information geometry. Based on this viewpoint, a new estimation method is proposed within the framework of the em algorithm. The proposed method differs from conventional methods in its objective function, especially in its treatment of unobserved comparisons, and it admits a consistent interpretation in a probability simplex. An estimation method with weight adaptation is also proposed from the viewpoint of sensitivity. Experimental results show that the proposed method works appropriately and that weight adaptation improves the accuracy of the estimates.
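The letter's em-algorithm estimator is not reproduced here. As background for the standard Bradley-Terry model it builds on, the following is a minimal sketch of the classical minorization-maximization (Zermelo/Hunter) iteration for estimating item strengths from pairwise win counts; the function name and the assumption that every pair is compared at least once are illustrative, not from the letter.

```python
import numpy as np

def bradley_terry_mm(wins, n_iter=500, tol=1e-10):
    """Estimate Bradley-Terry strengths p (normalized to sum to 1) by the
    classical MM iteration. wins[i, j] = number of times item i beat item j.
    Assumes each item appears in at least one comparison with a win."""
    n = wins.shape[0]
    p = np.full(n, 1.0 / n)          # start from the uniform point
    total_wins = wins.sum(axis=1)    # W_i: total wins of item i
    for _ in range(n_iter):
        denom = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i != j:
                    n_ij = wins[i, j] + wins[j, i]  # comparisons of pair (i, j)
                    if n_ij > 0:
                        denom[i] += n_ij / (p[i] + p[j])
        p_new = total_wins / denom   # MM update: p_i <- W_i / sum_j n_ij/(p_i+p_j)
        p_new /= p_new.sum()         # renormalize onto the probability simplex
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

# Example: item 0 dominates item 1, which dominates item 2.
wins = np.array([[0, 9, 9],
                 [1, 0, 9],
                 [1, 1, 0]], dtype=float)
p = bradley_terry_mm(wins)
```

The normalization step keeps the estimate in the probability simplex, which is the same representation in which the letter interprets its method geometrically; the treatment of unobserved comparisons (pairs with `n_ij = 0`) is precisely where the proposed objective departs from this classical scheme.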