In this paper we present a new approach to boosting for the construction of ensembles of classifiers. Instead of using the weights of the instances to train the next classifier, the approach uses the distribution given by the weighting scheme of boosting to construct a non-linear supervised projection of the original variables. With this method we construct ensembles that achieve a better generalization error and are more robust to the presence of noise. It has been proved that AdaBoost improves the margins of the instances achieved by the ensemble, and its practical success has been partially explained by this margin-maximization property. However, in noisy problems, which are likely to occur in real-world applications, maximizing the margins of mislabeled instances or outliers can lead to poor generalization. We propose an alternative approach in which the distribution of weights given by the boosting algorithm is used to obtain a supervised projection; the next classifier is then trained on the projected data using a uniform distribution over the training instances. The proposed approach is compared with three boosting techniques, namely AdaBoost, GentleBoost and MadaBoost, showing improved performance on a large set of 55 problems from the UCI Machine Learning Repository and lower sensitivity to noise in the class labels. The behavior of the proposed algorithm in terms of margin distribution and bias-variance decomposition is also studied.
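The round structure described above can be sketched in a minimal form. This is not the authors' implementation: the choice of a tiny one-hidden-layer tanh network as the supervised projection, the decision-stump base learner, and all function names (`weighted_projection`, `train_stump`, `boosting_projection_ensemble`) are illustrative assumptions. The key difference from plain AdaBoost is that the boosting weights `w` are consumed by the projection step, while the base classifier itself is trained on the projected data with uniform weights.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def train_stump(X, y):
    """Best threshold stump over all features; y in {-1,+1}, uniform weights."""
    n, d = X.shape
    best = (0, 0.0, 1, n + 1)  # (feature, threshold, polarity, errors)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= thr, pol, -pol)
                err = np.sum(pred != y)
                if err < best[3]:
                    best = (j, thr, pol, err)
    j, thr, pol, _ = best
    return lambda Z: np.where(Z[:, j] <= thr, pol, -pol)

def weighted_projection(X, y, w, hidden=2, epochs=100, lr=0.5):
    """Fit a tiny one-hidden-layer net on the *weighted* sample and return
    its hidden-layer map as a non-linear supervised projection (an assumed
    stand-in for whatever supervised projection method is actually used)."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden); b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)            # hidden activations
        out = np.tanh(H @ W2 + b2)          # network output in (-1, 1)
        g = w * (out - y) * (1 - out ** 2)  # weighted squared-loss gradient
        gH = np.outer(g, W2) * (1 - H ** 2)
        W2 -= lr * (H.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)
    return lambda Z: np.tanh(Z @ W1 + b1)

def boosting_projection_ensemble(X, y, T=5):
    n = len(y)
    w = np.ones(n) / n                      # boosting distribution
    members = []
    for _ in range(T):
        proj = weighted_projection(X, y, w)  # weights drive the projection...
        clf = train_stump(proj(X), y)        # ...the classifier sees uniform weights
        pred = clf(proj(X))
        err = np.clip(w @ (pred != y), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)       # AdaBoost-style reweighting
        w /= w.sum()
        members.append((alpha, proj, clf))
    def predict(Z):
        return np.sign(sum(a * c(p(Z)) for a, p, c in members))
    return predict
```

Because hard-to-classify points only reshape the projection rather than dominate the base learner's loss, a mislabeled outlier cannot force the classifier itself to overfit it, which is the intuition behind the claimed noise robustness.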