Quinlan, J. R. C4.5: Programs for Machine Learning. Morgan Kaufmann, 1993.
Ho, T. K. The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998.
Webb, G. I. MultiBoosting: A Technique for Combining Boosting and Wagging. Machine Learning, 2000.
Ho, T. K., Basu, M. Complexity Measures of Supervised Classification Problems. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002.
Nadeau, C., Bengio, Y. Inference for the Generalization Error. Machine Learning, 2003.
Kuncheva, L. I. Combining Pattern Classifiers: Methods and Algorithms. Wiley, 2004.
Demšar, J. Statistical Comparisons of Classifiers over Multiple Data Sets. Journal of Machine Learning Research, 2006.
Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I. H. The WEKA Data Mining Software: An Update. ACM SIGKDD Explorations Newsletter, 2009.
Maudes, J., Rodríguez, J. J., García-Osorio, C., García-Pedrajas, N. Random Feature Weights for Decision Tree Ensemble Construction. Information Fusion, 2012.
Castiello, C., Castellano, G., Fanelli, A. M. Meta-data: Characterization of Input Features for Meta-learning. MDAI'05: Proceedings of the Second International Conference on Modeling Decisions for Artificial Intelligence, 2005.
This paper proposes GRASP Forest, a method for constructing ensembles of decision trees. The method uses GRASP (Greedy Randomized Adaptive Search Procedure), a metaheuristic usually applied to optimization problems, to increase the diversity of the ensemble. While Random Forest increases diversity by randomly choosing a subset of attributes at each tree node, GRASP Forest takes all the attributes into account; the randomness in the method comes instead from the GRASP metaheuristic. Rather than choosing the best attribute from a randomly selected subset, as Random Forest does, the attribute is chosen at random from a restricted list of good candidate attributes. Besides attribute selection, GRASP is also used to select the split value for each numeric attribute. The method is compared with Bagging, Random Forest, Random Subspaces, AdaBoost and MultiBoost, and the results are very competitive for the proposed method.