Ensembles that combine the decisions of classifiers trained on perturbed versions of the training set, in which the class labels of the training examples are randomly switched, can produce a significant error reduction, provided that the ensemble is large and the class-switching rate is high. The classifiers generated by this procedure have statistically uncorrelated errors on the training set. Hence, the training error of the ensembles they form exhibits a similar dependence on ensemble size, independently of the classification problem. In particular, for binary classification problems, the classification performance of the ensemble on the training data can be analysed in terms of a Bernoulli process. Experiments on several UCI datasets demonstrate the improvements in classification accuracy that can be obtained using these class-switching ensembles.
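The procedure described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn decision trees on a synthetic binary problem; the switching rate, ensemble size, and dataset are assumed values chosen for the sketch, not the settings of the original experiments:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

p_switch = 0.3   # class-switching rate (assumed value for illustration)
n_members = 101  # a large ensemble is needed for the error reduction

members = []
for _ in range(n_members):
    # Perturb the training set: flip each binary label with probability p_switch.
    y_sw = y_tr.copy()
    flip = rng.random(len(y_sw)) < p_switch
    y_sw[flip] = 1 - y_sw[flip]
    # Fully grown trees fit the switched labels, so each member errs on an
    # original training label roughly independently with probability p_switch;
    # the majority vote's training error then behaves like the tail of a
    # Binomial(n_members, p_switch) distribution -- the Bernoulli-process view.
    members.append(DecisionTreeClassifier(random_state=0).fit(X_tr, y_sw))

# Combine the members by unweighted majority vote.
votes = np.mean([m.predict(X_te) for m in members], axis=0)
y_pred = (votes > 0.5).astype(int)
acc = np.mean(y_pred == y_te)
```

Each member alone is a poor classifier (its labels are 30% noise), but because the members' errors are uncorrelated, the majority vote recovers accuracy well above that of any single member.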