C4.5: programs for machine learning
Induction of multivariate regression trees for design optimization. AAAI '94 Proceedings of the twelfth national conference on Artificial intelligence (vol. 1)
Automatic Construction of Decision Trees from Data: A Multi-Disciplinary Survey. Data Mining and Knowledge Discovery
Multitask learning. Machine Learning - Special issue on inductive transfer
Ensembles of Multi-Objective Decision Trees. ECML '07 Proceedings of the 18th European conference on Machine Learning
ART-Based Neural Networks for Multi-label Classification. IDA '09 Proceedings of the 8th International Symposium on Intelligent Data Analysis: Advances in Intelligent Data Analysis VIII
Predictive maintenance with multi-target classification models. ACIIDS'10 Proceedings of the Second international conference on Intelligent information and database systems: Part II
MulO-AntMiner: a new ant colony algorithm for the multi-objective classification problem. ICCSA'11 Proceedings of the 2011 international conference on Computational science and its applications - Volume Part II
Learning predictive clustering rules. KDID'05 Proceedings of the 4th international conference on Knowledge Discovery in Inductive Databases
Multi-Label Classification Method for Multimedia Tagging. International Journal of Multimedia Data Engineering & Management
Iterative classification for multiple target attributes. Journal of Intelligent Information Systems
Multi-target regression with rule ensembles. The Journal of Machine Learning Research
This paper presents a novel decision-tree induction method for a multi-objective data set, i.e. a data set with a multi-dimensional class. Inductive decision-tree learning is one of the most frequently used methods for a single-objective data set, i.e. a data set with a single-dimensional class. In real data analysis, however, we usually have multiple objectives, and a classifier that explains them simultaneously would be useful and more readable. A conventional decision-tree inducer requires the multi-dimensional class to be transformed into a single-dimensional one, but such a transformation can considerably worsen both accuracy and readability. To circumvent this problem, we propose the bloomy decision tree, which handles a multi-dimensional class without such transformations. A bloomy decision tree consists of split nodes, each of which splits examples according to their attribute values, and flower nodes, each of which predicts one class dimension of the examples. A flower node can appear not only at the fringe of the tree but also inside it. Pruning is performed during tree construction and evaluates each class dimension based on Cramér's V. The proposed method has been implemented as D3-B (Decision tree in Bloom) and tested on eleven data sets. In the experiments, D3-B achieved higher accuracy than C4.5 on nine data sets and tied with it on the other two. In terms of readability, D3-B produced fewer split nodes on all data sets, and thus outperforms C4.5.
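The pruning criterion mentioned above, Cramér's V, is a standard measure of association between two nominal variables, computed from the chi-squared statistic of their contingency table; it ranges from 0 (independence) to 1 (perfect association). A minimal stdlib-only sketch of the statistic is given below (the function name and table layout are illustrative; the paper's exact per-dimension evaluation procedure is not reproduced here):

```python
import math

def cramers_v(table):
    """Cramér's V for an r x c contingency table given as a list of rows of counts.

    V = sqrt(chi2 / (n * (min(r, c) - 1))), where chi2 is Pearson's
    chi-squared statistic and n is the total number of observations.
    """
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            if expected > 0:
                chi2 += (observed - expected) ** 2 / expected
    k = min(len(table), len(table[0]))  # min(r, c)
    return math.sqrt(chi2 / (n * (k - 1)))

# A split perfectly separating a binary class dimension gives V = 1.0;
# a split carrying no information about it gives V = 0.0.
print(cramers_v([[10, 0], [0, 10]]))  # perfect association
print(cramers_v([[5, 5], [5, 5]]))    # independence
```

In a tree inducer of this kind, a class dimension whose association with further splits is sufficiently weak can be predicted immediately at a flower node rather than split on further.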