In this paper, we introduce a novel machine learning approach for regression based on the idea of combining bagging and boosting, which we call BagBoo. Our BagBoo model borrows its high accuracy from Friedman's gradient boosting [2] and its high efficiency and scalability through parallelism from Breiman's bagging [1]. We run empirical evaluations on large-scale Web ranking data and demonstrate that BagBoo not only achieves better relevance than standalone bagging or boosting, but also outperforms most previously published results on these data sets. We also emphasize that BagBoo is intrinsically scalable and parallelizable: it allowed us to train on the order of half a million trees on 200 nodes in 2 hours of CPU time, beat all competitors in the Internet Mathematics relevance competition sponsored by Yandex, and rank among the top algorithms in both tracks of the Yahoo ICML-2010 challenge. We conclude by noting that while the impressive experimental results presented here concern regression trees, the hybrid BagBoo model is applicable to other domains, such as classification, and to other base learners.
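The core idea described above — bagging over gradient-boosted tree models — can be sketched as follows. This is a minimal illustrative sketch using scikit-learn's `GradientBoostingRegressor` as the boosted base model, not the authors' implementation; the ensemble sizes and data are arbitrary placeholders. Each bag trains a boosted model on an independent bootstrap sample (which is what makes the bags trivially parallelizable), and the final prediction averages over the bags:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=500, n_features=10, noise=0.5, random_state=0)

n_bags = 20  # illustrative; the paper describes ensembles of ~half a million trees
models = []
for b in range(n_bags):
    # Bagging step: draw a bootstrap sample of the training set
    idx = rng.integers(0, len(X), size=len(X))
    # Boosting step: fit a gradient-boosted tree model on that sample
    gbm = GradientBoostingRegressor(n_estimators=50, max_depth=3, random_state=b)
    gbm.fit(X[idx], y[idx])
    models.append(gbm)

# BagBoo-style prediction: average the predictions of the boosted models
pred = np.mean([m.predict(X) for m in models], axis=0)
```

Because each of the `n_bags` boosted models is trained independently, the outer loop can be distributed across worker nodes with no communication until the final averaging, which is the source of the scalability the abstract refers to.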