The Strength of Weak Learnability
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th Annual ACM Symposium on Theory of Computing (STOC '94), May 23–25, 1994, and Second European Conference on Computational Learning Theory (EuroCOLT '95), March 13–15, 1995
Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems 11
Machine Learning
Neural Networks for Pattern Recognition
Pattern Recognition and Neural Networks
An Adaptive Version of the Boost by Majority Algorithm
Machine Learning
Combining Pattern Classifiers: Methods and Algorithms
Pattern Recognition and Machine Learning (Information Science and Statistics)
Efficient Margin Maximizing with Boosting
The Journal of Machine Learning Research
An efficient modified boosting method for solving classification problems
Journal of Computational and Applied Mathematics
AdaBoost with SVM-based component classifiers
Engineering Applications of Artificial Intelligence
Boosting by weighting critical and erroneous samples
Neurocomputing
Boosting through optimization of margin distributions
IEEE Transactions on Neural Networks
Pattern Classification Using Ensemble Methods
Moderating the outputs of support vector machine classifiers
IEEE Transactions on Neural Networks
A Dynamically Adjusted Mixed Emphasis Method for Building Boosting Ensembles
IEEE Transactions on Neural Networks
Real AdaBoost ensembles have exceptional capabilities for successfully solving classification problems. This capability stems from progressively constructing learners that pay more attention to samples that are difficult to classify. However, the corresponding emphasis can be excessive: in particular, when the problem is asymmetric or includes imbalanced outliers, even previously proposed modifications of the basic algorithm are not as effective as desired. In this paper, we introduce a simple modification that uses the neighborhood concept to reduce these drawbacks. Experimental results confirm the potential of the proposed scheme. The main conclusions of our work, along with some suggestions for further research along this line, close the paper.
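The abstract does not spell out the emphasis rule being modified. As a point of reference, the standard Real AdaBoost emphasis step (the mechanism the proposed neighborhood scheme adjusts) can be sketched as follows; the function name and the use of signed margins as inputs are illustrative assumptions, not the authors' implementation.

```python
import math

def real_adaboost_reweight(weights, margins):
    """One Real AdaBoost emphasis step (illustrative sketch).

    weights: current sample weights (non-negative, roughly summing to 1)
    margins: signed margins y_i * f(x_i) of the current ensemble, so that
             negative or small values mark hard / misclassified samples.

    Multiplies each weight by exp(-margin), so hard samples gain weight
    and confidently correct ones lose it, then renormalizes to sum to 1.
    """
    updated = [w * math.exp(-m) for w, m in zip(weights, margins)]
    z = sum(updated)  # normalization constant
    return [w / z for w in updated]

# Toy usage: two well-classified samples and one misclassified sample.
w = real_adaboost_reweight([1/3, 1/3, 1/3], [1.0, 0.5, -0.8])
# The sample with negative margin now carries the largest weight.
```

Because the exponential grows without bound for negative margins, outliers can dominate the distribution after a few rounds, which is exactly the excessive-emphasis problem the abstract describes.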