Boolean Feature Discovery in Empirical Learning. Machine Learning.
Incremental Learning of Rules and Meta-rules. Proceedings of the Seventh International Conference on Machine Learning (1990).
Separate-and-Conquer Rule Learning. Artificial Intelligence Review.
Rule Induction with CN2: Some Recent Improvements. EWSL '91: Proceedings of the European Working Session on Machine Learning.
Classification with Intersecting Rules. ALT '02: Proceedings of the 13th International Conference on Algorithmic Learning Theory.
On Handling Conflicts between Rules with Numerical Features. Proceedings of the 2006 ACM Symposium on Applied Computing.
Multi-class Prediction Using Stochastic Logic Programs. Inductive Logic Programming.
Protein Fold Discovery Using Stochastic Logic Programs. Probabilistic Inductive Logic Programming.
When an unordered set of classification rules is applied, the rules may assign more than one class to a particular example. Previous methods of resolving such conflicts between rules include using the most frequent class among the examples covered by the conflicting rules (as done in CN2) and using naïve Bayes to calculate the most probable class. This paper presents an alternative way of solving the problem: generating new rules from the examples covered by the conflicting rules, and using these newly induced rules for classification. Experiments on a number of domains show that this method significantly outperforms both the CN2 approach and naïve Bayes.
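The idea in the abstract can be sketched in a few lines. The following is a toy illustration, not the authors' implementation: rules are represented as hypothetical (predicate, class) pairs, and `induce` stands in for a real rule learner such as CN2 (here it is a trivial single-split learner, purely for demonstration). When conflicting rules fire on an example, the resolver gathers the training examples covered by all of those rules and re-runs induction on just that region.

```python
from collections import Counter

def fired(rules, x):
    """Return the rules whose condition covers example x.
    A rule is a (predicate, class) pair (hypothetical representation)."""
    return [(p, c) for p, c in rules if p(x)]

def induce(region):
    """Stand-in for a real rule learner (e.g. CN2). This toy version
    splits on the median 'size' value and emits one majority-class
    rule per side of the split."""
    sizes = sorted(ex["size"] for ex, _ in region)
    t = sizes[len(sizes) // 2]
    lo = [c for ex, c in region if ex["size"] < t]
    hi = [c for ex, c in region if ex["size"] >= t]
    new_rules = []
    if lo:
        new_rules.append((lambda x, t=t: x["size"] < t,
                          Counter(lo).most_common(1)[0][0]))
    if hi:
        new_rules.append((lambda x, t=t: x["size"] >= t,
                          Counter(hi).most_common(1)[0][0]))
    return new_rules

def classify(rules, x, train):
    """Classify x; on a rule conflict, induce fresh rules from the
    training examples covered by all conflicting rules."""
    hits = fired(rules, x)
    classes = {c for _, c in hits}
    if len(classes) <= 1:
        return next(iter(classes), None)
    region = [(ex, c) for ex, c in train
              if all(p(ex) for p, _ in hits)]
    if not region:
        return None
    matches = fired(induce(region), x)
    return matches[0][1] if matches else None

# Hypothetical rule set and training data for illustration.
rules = [
    (lambda x: x["size"] > 5, "pos"),
    (lambda x: x["color"] == "red", "neg"),
]
train = [
    ({"size": 6, "color": "red"}, "neg"),
    ({"size": 7, "color": "red"}, "neg"),
    ({"size": 9, "color": "red"}, "pos"),
    ({"size": 10, "color": "red"}, "pos"),
]

# Both rules fire on this example with different classes; the newly
# induced rules resolve the conflict in favour of "pos".
print(classify(rules, {"size": 10, "color": "red"}, train))
```

The CN2-style baseline the abstract mentions would instead take the majority class of `region` directly; the point of the re-induction step is that the new rules can partition the overlap region rather than settle it with a single vote.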