One approach to classification tasks is to use machine learning techniques to derive classifiers from learning instances. Combining several base classifiers into a decision committee has been shown to reduce classification error. The main current decision-committee learning approaches, boosting and bagging, resample the training set and can be used with different machine learning techniques for deriving the base classifiers. As a combining method, boosting uses a form of weighted voting while bagging uses equal-weight voting; neither takes into account the local competence that the base classifiers may have in different regions of the problem space. We have previously proposed a dynamic integration technique for ensembles of classifiers. In this paper, the proposed dynamic integration technique is applied to AdaBoost and bagging. Comparison results on several datasets from the UCI machine learning repository show that boosting and bagging with dynamic integration of classifiers often achieve better accuracy than boosting and bagging with their original voting techniques.
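The contrast drawn above between the three combining methods can be sketched in a few lines. This is an illustrative sketch, not the paper's implementation: the function names and the toy predictions are hypothetical, and dynamic integration is shown only in its simplest selection form (pick the classifier with the lowest estimated local error near the instance).

```python
# Illustrative sketch (assumed names, toy data): three ways to combine the
# predictions of an ensemble of base classifiers for one test instance.

def majority_vote(predictions):
    """Bagging-style combination: every base classifier gets an equal vote."""
    return max(set(predictions), key=predictions.count)

def weighted_vote(predictions, weights):
    """Boosting-style combination: each vote is weighted, e.g. by the
    classifier's performance on the training set."""
    totals = {}
    for label, weight in zip(predictions, weights):
        totals[label] = totals.get(label, 0.0) + weight
    return max(totals, key=totals.get)

def dynamic_selection(predictions, local_errors):
    """Dynamic integration (selection variant): trust the classifier with the
    lowest estimated error in the neighbourhood of the test instance."""
    best = min(range(len(predictions)), key=lambda i: local_errors[i])
    return predictions[best]

# Toy example: three base classifiers predict a class label for one instance.
preds = ["A", "B", "B"]
print(majority_vote(preds))                       # equal votes -> "B"
print(weighted_vote(preds, [0.9, 0.2, 0.2]))      # heavy first vote -> "A"
print(dynamic_selection(preds, [0.4, 0.1, 0.3]))  # classifier 2 locally best -> "B"
```

The point of the example is that the global vote weights used by boosting and the equal weights used by bagging are fixed across the whole problem space, whereas dynamic integration lets a different base classifier dominate in different regions.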