Boosting has attracted great interest in the machine learning community because of its impressive performance on classification and regression problems. The success of boosting algorithms is often interpreted in terms of margin theory. Recently, it has been shown that bounds on the generalization error of classifiers can be obtained by explicitly taking the margin distribution of the training data into account. However, most boosting algorithms used in practice optimize a convex loss function and do not exploit the margin distribution. In this brief, we design a new boosting algorithm, termed margin-distribution boosting (MDBoost), which directly maximizes the average margin while simultaneously minimizing the margin variance, thereby optimizing the margin distribution. A totally corrective optimization algorithm based on column generation is proposed to implement MDBoost. Experiments on various data sets show that MDBoost outperforms AdaBoost and LPBoost in most cases.
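To make the objective concrete, the following is a minimal sketch (not the authors' column-generation solver) of the margin-distribution idea: given a fixed pool of weak learners, optimize their weights on the probability simplex to maximize the average margin minus a penalty on the margin variance. The function name `mdboost_weights`, the trade-off parameter `D`, and the projected-gradient solver are illustrative assumptions; the paper solves the corresponding problem exactly and adds weak learners incrementally.

```python
import numpy as np

def mdboost_weights(H, y, D=1.0, iters=2000, lr=0.01):
    """Illustrative sketch of the margin-distribution objective.

    H : (n_samples, n_learners) weak-learner outputs in {-1, +1}
    y : (n_samples,) labels in {-1, +1}
    D : assumed trade-off parameter between mean margin and margin variance

    Maximizes  mean(margin) - (D/2) * var(margin)  over weights w
    constrained to the probability simplex (w >= 0, sum(w) = 1),
    where margin_i = y_i * sum_j w_j * H[i, j].
    """
    n, m = H.shape
    A = y[:, None] * H           # A @ w gives the margin of each sample
    w = np.full(m, 1.0 / m)      # start from uniform weights
    for _ in range(iters):
        margins = A @ w
        mean = margins.mean()
        # gradient of mean(margin) - (D/2) * var(margin) w.r.t. w
        grad = A.mean(axis=0) - D * (A.T @ (margins - mean)) / n
        w += lr * grad
        # crude simplex projection: clip negatives, renormalize
        w = np.clip(w, 0.0, None)
        w /= w.sum()
    return w
```

On a toy problem where one weak learner classifies every sample correctly, the solver concentrates the weight on that learner, since it achieves mean margin 1 with zero variance.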