Based on the AdaBoost algorithm, this paper proposes a modified boosting method for solving classification problems. The method predicts the class label of an example by weighted majority voting over an ensemble of classifiers. Each classifier is obtained by applying a given weak learner to a subsample (smaller than the original training set) drawn from the training set according to the probability distribution maintained over it. A parameter is introduced into AdaBoost's reweighting scheme for updating the probabilities assigned to training examples, allowing the algorithm to be more accurate than AdaBoost. Experimental results on synthetic data and several real-world data sets from the UCI repository show that the proposed method improves AdaBoost's prediction accuracy, execution speed, and robustness to classification noise. Furthermore, the diversity-accuracy patterns of the ensemble classifiers are investigated with kappa-error diagrams.
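The general scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's method: it uses the standard AdaBoost reweighting update and a weighted subsample for training each round's classifier; the paper's extra reweighting parameter is omitted because its exact form is not given here, and the `subsample_ratio` name and the decision-stump weak learner are assumptions made for the example.

```python
import math
import random


def train_boosted_ensemble(xs, ys, weak_learner, n_rounds=10, subsample_ratio=0.5):
    """AdaBoost-style training where each weak classifier is fit on a
    subsample drawn according to the maintained distribution (sketch)."""
    n = len(xs)
    weights = [1.0 / n] * n          # distribution over training examples
    ensemble = []                    # list of (classifier, alpha) pairs
    for _ in range(n_rounds):
        # Draw a subsample smaller than the training set, following the weights.
        k = max(1, int(subsample_ratio * n))
        idx = random.choices(range(n), weights=weights, k=k)
        h = weak_learner([xs[i] for i in idx], [ys[i] for i in idx])
        # Weighted error is measured on the full training set.
        err = sum(w for x, y, w in zip(xs, ys, weights) if h(x) != y)
        err = min(max(err, 1e-10), 1 - 1e-10)      # guard log against 0 or 1
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((h, alpha))
        # Standard AdaBoost update: up-weight misclassified examples.
        weights = [w * math.exp(-alpha if h(x) == y else alpha)
                   for x, y, w in zip(xs, ys, weights)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble


def predict(ensemble, x):
    """Weighted majority vote; labels are assumed to be in {-1, +1}."""
    score = sum(alpha * h(x) for h, alpha in ensemble)
    return 1 if score >= 0 else -1


def stump_learner(xs, ys):
    """A trivial weak learner: the best 1-D threshold (decision stump)."""
    best, best_err = None, float("inf")
    for t in xs:
        for sign in (1, -1):
            err = sum(1 for x, y in zip(xs, ys)
                      if (sign if x >= t else -sign) != y)
            if err < best_err:
                best, best_err = (t, sign), err
    t, sign = best
    return lambda x: sign if x >= t else -sign
```

A usage example on a toy one-dimensional data set: `train_boosted_ensemble([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], [-1, -1, -1, 1, 1, 1], stump_learner, n_rounds=5)` produces an ensemble whose weighted vote separates the two classes.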