Communications of the ACM
The Strength of Weak Learnability
Machine Learning
An improved boosting algorithm and its implications on learning complexity
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Boosting a weak learning algorithm by majority
Information and Computation
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the Theory of Computing (STOC '94), May 23–25, 1994, and second annual European conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
An adaptive version of the boost by majority algorithm
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
On the boosting ability of top-down decision tree learning algorithms
Journal of Computer and System Sciences
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Mining high-speed data streams
Proceedings of the sixth ACM SIGKDD international conference on Knowledge discovery and data mining
Adaptive Sampling Methods for Scaling Up Knowledge Discovery Algorithms
Data Mining and Knowledge Discovery
Provably Fast Training Algorithms for Support Vector Machines
ICDM '01 Proceedings of the 2001 IEEE International Conference on Data Mining
Improving Algorithms for Boosting
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Smooth Boosting and Learning with Malicious Noise
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Finding the most interesting patterns in a database quickly by using sequential sampling
The Journal of Machine Learning Research
Optimally-smooth adaptive boosting and application to agnostic learning
The Journal of Machine Learning Research
Smooth Boosting for Margin-Based Ranking
ALT '08 Proceedings of the 19th international conference on Algorithmic Learning Theory
Approximate reduction from AUC maximization to 1-norm soft margin optimization
ALT'11 Proceedings of the 22nd international conference on Algorithmic learning theory
Smooth boosting algorithms are variants of boosting methods that maintain only smooth distributions over the data. They are provably noise-tolerant and can be used in the “boosting by filtering” scheme, which is suitable for learning over huge data sets. However, current smooth boosting algorithms leave room for improvement: among non-smooth boosting algorithms, Real AdaBoost and InfoBoost can perform more efficiently than typical boosting algorithms by using an information-based criterion for choosing hypotheses. In this paper, we propose a new smooth boosting algorithm with another information-based criterion, based on the Gini index. We show that it inherits the advantages of both approaches: smooth boosting and information-based hypothesis selection.
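The two ingredients named in the abstract can be illustrated together: a smooth-boosting-style loop keeps example weights capped (MadaBoost-style) so the maintained distribution never concentrates on few examples, while each round selects the weak hypothesis maximizing weighted Gini-impurity reduction rather than minimizing weighted error. The following is a minimal toy sketch under these assumptions, not the paper's actual algorithm; the function names (`gini_gain`, `smooth_boost`) and the decision-stump setup are illustrative.

```python
import numpy as np

def gini_gain(y, h, w):
    """Weighted Gini-impurity reduction achieved by splitting on h (±1 outputs)."""
    def gini(weights, labels):
        tot = weights.sum()
        if tot == 0:
            return 0.0
        p = weights[labels == 1].sum() / tot  # weighted fraction of positives
        return 2.0 * p * (1.0 - p)
    parent = gini(w, y)
    mask = (h == 1)
    child = (w[mask].sum() * gini(w[mask], y[mask]) +
             w[~mask].sum() * gini(w[~mask], y[~mask])) / w.sum()
    return parent - child

def smooth_boost(X, y, stumps, rounds=10, cap=1.0):
    """Toy smooth boosting: weights are capped at `cap` before normalization
    (keeping the distribution smooth, in the spirit of MadaBoost), and each
    round picks the stump with the largest Gini gain under the current weights."""
    n = len(y)
    margins = np.zeros(n)
    model = []
    for _ in range(rounds):
        w = np.minimum(np.exp(-y * margins), cap)  # capped exponential weights
        w = w / w.sum()
        preds = [s(X) for s in stumps]
        best = max(range(len(stumps)), key=lambda i: gini_gain(y, preds[i], w))
        h = preds[best]
        eps = w[h != y].sum()                      # weighted error of chosen stump
        alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
        margins += alpha * h
        model.append((stumps[best], alpha))
    return model

def predict(model, X):
    return np.sign(sum(alpha * s(X) for s, alpha in model))
```

The weight cap is what distinguishes the smooth variant: without it, the exponential weights of a few hard (or noisy) examples can dominate the distribution, which is precisely the failure mode smooth boosting avoids.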