Ensemble learning based on multi-task class labels
PAKDD'10 Proceedings of the 14th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining - Volume Part II
Many ensemble methods, such as Bagging, Boosting, and Random Forest, have been proposed and are widely used in real-world applications. Some of them perform better on noise-free data, while others perform better on noisy data. In practice, however, ensemble methods that consistently achieve good performance both with and without noise are more desirable. In this paper, we propose a new method, MTForest, that ensembles decision tree learning algorithms by enumerating each input attribute as an extra task; each extra task introduces a different additional inductive bias, yielding diverse yet accurate component decision trees. We conduct experiments on 36 widely used UCI data sets that cover a wide range of domains and data characteristics, running all algorithms within the Weka platform. The results show that in situations without classification noise, MTForest is comparable to Boosting and Random Forest and significantly better than Bagging, while in situations with classification noise, MTForest is significantly better than Boosting and Random Forest and slightly better than Bagging. MTForest is therefore a good choice for ensembling decision trees in situations with or without noise.
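The core idea, using each input attribute as an auxiliary prediction task to diversify the component trees, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes scikit-learn's multi-output decision trees as a stand-in for multi-task tree induction, quantile binning to turn continuous attributes into auxiliary class labels, and hypothetical helper names (mtforest_sketch, mtforest_predict).

```python
# Minimal sketch of the MTForest idea (an approximation, not the paper's
# exact algorithm): for each input attribute, train one multi-output
# decision tree that predicts both the class label and that attribute
# (the extra task), then aggregate class predictions by majority vote.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.tree import DecisionTreeClassifier


def mtforest_sketch(X, y, n_bins=3, random_state=0):
    """Train one multi-task tree per input attribute (hypothetical helper)."""
    # Discretize continuous attributes so each can serve as an extra
    # class label; binning strategy is a simplifying assumption.
    disc = KBinsDiscretizer(n_bins=n_bins, encode="ordinal", strategy="quantile")
    X_binned = disc.fit_transform(X).astype(int)

    trees = []
    for a in range(X.shape[1]):
        # Multi-output target: column 0 is the real class label,
        # column 1 is attribute `a` treated as the auxiliary task.
        Y = np.column_stack([y, X_binned[:, a]])
        tree = DecisionTreeClassifier(random_state=random_state)
        tree.fit(X, Y)  # scikit-learn trees accept multi-output targets
        trees.append(tree)
    return trees


def mtforest_predict(trees, X):
    """Majority vote over the class output (output 0) of each tree."""
    votes = np.stack([t.predict(X)[:, 0] for t in trees], axis=1)
    return np.array([np.bincount(row.astype(int)).argmax() for row in votes])


X, y = load_iris(return_X_y=True)
trees = mtforest_sketch(X, y)
print((mtforest_predict(trees, X) == y).mean())  # training accuracy
```

In this sketch the auxiliary task shapes each tree only through the joint splitting criterion of the multi-output tree; the paper's method incorporates the extra task during tree induction itself, so the sketch should be read as an illustration of the diversity mechanism rather than a faithful reimplementation.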