In this chapter, we present a system that enhances the representational capabilities of decision and regression tree learning by extending it to first-order logic, i.e., relational representations as commonly used in Inductive Logic Programming. We describe an algorithm named Structural Classification and Regression Trees (S-CART), which is capable of inducing first-order trees for both classification and regression problems, i.e., for the prediction of either discrete classes or numerical values. We arrive at this algorithm by a strategy called upgrading: we start from a propositional induction algorithm and turn it into a relational learner by devising suitable extensions of the representation language and the associated algorithms. In particular, we have upgraded CART, the classical method for learning classification and regression trees, to handle relational examples and background knowledge. The system constructs a tree containing a literal (an atomic formula or its negation) or a conjunction of literals in each node, and assigns either a discrete class or a numerical value to each leaf. In addition, we have extended the CART methodology by adding linear regression models to the leaves of the trees; this has no counterpart in CART, but was inspired by its approach to pruning. The regression variant of S-CART is one of the few systems applicable to relational regression problems. Experiments in several real-world domains demonstrate that the approach is useful and competitive with existing methods, indicating that the advantage of relatively small and comprehensible models does not come at the expense of predictive accuracy.
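To make the tree structure described above concrete, the following is a minimal illustrative sketch (not the actual S-CART implementation): internal nodes hold a test standing in for a literal or conjunction of literals evaluated against an example and background knowledge, and leaves predict either a constant value or, in the extended variant, a linear model over numeric attributes. All names here (`has_ring`, the attribute keys) are hypothetical.

```python
# Illustrative sketch of a first-order-style tree with linear models in
# the leaves, in the spirit of S-CART; not the authors' implementation.
from dataclasses import dataclass
from typing import Callable, Optional, Union

@dataclass
class Leaf:
    # A leaf predicts a constant (class or mean value) or, if `coeffs`
    # is given, a linear model: intercept + sum of weighted attributes.
    value: float
    coeffs: Optional[dict] = None  # e.g. {"logp": 0.5}

    def predict(self, example: dict) -> float:
        if self.coeffs is None:
            return self.value
        return self.value + sum(w * example[a] for a, w in self.coeffs.items())

@dataclass
class Node:
    # An internal node holds a test that plays the role of a literal
    # (or conjunction of literals) over the example plus background
    # knowledge, modelled here as a boolean function.
    test: Callable[[dict], bool]
    yes: Union["Node", "Leaf"]
    no: Union["Node", "Leaf"]

    def predict(self, example: dict) -> float:
        branch = self.yes if self.test(example) else self.no
        return branch.predict(example)

# Hypothetical background predicate: does the molecule contain a ring?
def has_ring(example: dict) -> bool:
    return example.get("rings", 0) > 0

tree = Node(test=has_ring,
            yes=Leaf(value=1.0, coeffs={"logp": 0.5}),  # linear model leaf
            no=Leaf(value=-0.2))                        # constant leaf

print(tree.predict({"rings": 1, "logp": 2.0}))  # 1.0 + 0.5*2.0 = 2.0
print(tree.predict({"rings": 0}))               # -0.2
```

In the real system the tests are logical literals resolved against relational background knowledge rather than Python functions, but the prediction path, walking from the root to a leaf and applying that leaf's model, is the same.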