Approximate reduction from AUC maximization to 1-norm soft margin optimization
ALT'11: Proceedings of the 22nd International Conference on Algorithmic Learning Theory
We propose a new boosting algorithm for bipartite ranking problems. Our algorithm, called SoftRankBoost, is a modification of RankBoost that maintains only smooth distributions over the data. SoftRankBoost provably achieves approximately the maximum soft margin over all pairs of positive and negative examples, which implies a high AUC score on future data.
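The abstract does not spell out SoftRankBoost's update rule, so the following Python sketch only illustrates the general setting under stated assumptions: a RankBoost-style boosting loop over (positive, negative) example pairs with threshold stumps as weak rankers, plus a per-pair weight cap standing in for the smoothing step. The function name softrankboost_sketch, the cap 1/(nu * #pairs), the parameters T and nu, and the synthetic data are all illustrative choices, not the paper's actual algorithm.

import numpy as np

def softrankboost_sketch(X_pos, X_neg, T=50, nu=0.1):
    # Illustrative RankBoost-style booster over (positive, negative) pairs.
    # The weight cap 1/(nu * #pairs) is an ASSUMED smoothing step,
    # not the update rule from the paper.
    p, n, d = len(X_pos), len(X_neg), X_pos.shape[1]
    D = np.full((p, n), 1.0 / (p * n))              # distribution over all pairs
    cap = 1.0 / (nu * p * n)                        # smoothing cap (assumption)
    ensemble = []                                   # (alpha, feature, threshold)

    for _ in range(T):
        # Weak rankers are threshold stumps h(x) = 1[x_j > theta].
        w_pos, w_neg = D.sum(axis=1), D.sum(axis=0) # marginal pair weights
        best_r, best_j, best_theta = 0.0, 0, 0.0
        for j in range(d):
            for theta in np.unique(np.r_[X_pos[:, j], X_neg[:, j]]):
                hp = (X_pos[:, j] > theta).astype(float)
                hn = (X_neg[:, j] > theta).astype(float)
                # r = E_D[h(pos) - h(neg)]: weighted pairwise ordering quality.
                r = hp @ w_pos - hn @ w_neg
                if abs(r) > abs(best_r):
                    best_r, best_j, best_theta = r, j, theta
        r = float(np.clip(best_r, -0.999, 0.999))
        alpha = 0.5 * np.log((1.0 + r) / (1.0 - r)) # RankBoost-style step size
        ensemble.append((alpha, best_j, best_theta))

        # Reweight: pairs misordered by the chosen weighted stump gain weight.
        hp = (X_pos[:, best_j] > best_theta).astype(float)
        hn = (X_neg[:, best_j] > best_theta).astype(float)
        D *= np.exp(alpha * (hn[None, :] - hp[:, None]))
        D /= D.sum()
        # Smoothing: cap any single pair weight, then renormalize (assumption).
        D = np.minimum(D, cap)
        D /= D.sum()

    def score(X):
        X = np.atleast_2d(np.asarray(X, dtype=float))
        return sum(a * (X[:, j] > th) for a, j, th in ensemble)

    return score

# Usage on synthetic data: empirical AUC is the fraction of
# (positive, negative) pairs that the learned score orders correctly.
rng = np.random.default_rng(0)
X_pos = rng.normal(1.0, 1.0, size=(40, 3))
X_neg = rng.normal(0.0, 1.0, size=(60, 3))
score = softrankboost_sketch(X_pos, X_neg, T=30)
auc = np.mean(score(X_pos)[:, None] > score(X_neg)[None, :])
print(f"empirical AUC on the training sample: {auc:.3f}")

The empirical AUC computed at the end is exactly the pairwise quantity the soft-margin guarantee is meant to control: the fraction of positive-negative pairs ranked in the correct order by the combined score.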