Ensemble learning strategies, especially Boosting and Bagging of decision trees, have demonstrated an impressive capacity to improve the prediction accuracy of base learning algorithms. Further gains have been demonstrated by strategies that combine simple ensemble formation approaches. In this paper, we investigate the hypothesis that the accuracy improvement of multistrategy approaches to ensemble learning is due to an increase in the diversity of the ensemble members that are formed. Guided by this hypothesis, we develop three new multistrategy ensemble learning techniques. Experimental results across a wide variety of natural domains suggest that these multistrategy ensemble learning techniques are, on average, more accurate than their component ensemble learning techniques.
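The core idea of a multistrategy ensemble, layering one ensemble formation strategy on top of another so that the combined members are more diverse, can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' exact algorithms: it simply bags several independently boosted committees of decision trees, whereas the techniques studied in the paper combine strategies in their own specific ways (e.g., MultiBoosting combines boosting with wagging).

```python
# Hedged sketch: an outer Bagging ensemble whose members are themselves
# inner AdaBoost ensembles of decision stumps. Illustrative only; this is
# not the paper's specific multistrategy algorithms.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A synthetic binary classification task stands in for a "natural domain".
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Inner strategy: a boosted committee of shallow trees.
booster = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=1), n_estimators=10, random_state=0)

# Outer strategy: bag several such committees, each trained on a
# bootstrap resample, which injects additional diversity.
multi = BaggingClassifier(booster, n_estimators=5, random_state=0)
multi.fit(X_tr, y_tr)
print("test accuracy:", round(multi.score(X_te, y_te), 2))
```

Because each bagged member sees a different bootstrap sample before boosting even begins, the resulting committees differ more from one another than repeated runs of boosting alone would, which is the diversity mechanism the paper's hypothesis points to.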