Brief Communication: Finding rule groups to classify high dimensional gene expression datasets
Computational Biology and Chemistry
Inducing general functions from specific training examples is a central problem in machine learning, and sets of if-then rules are among the most expressive and readable representations. Many induction algorithms, such as ID3, AQ, CN2, and their variants, have been proposed to find if-then rules; sequential covering is the core technique underlying them. To avoid testing all possible selectors, ID3 uses entropy gain to select the best attribute, AQ constrains the size of the star, and CN2 adopts beam search. These methods speed up induction, but many good selectors are filtered out. In this work, we introduce a new induction algorithm based on enumerating all possible selectors. In contrast to previous work, we use pruning power to discard irrelevant selectors while guaranteeing that no good selector is filtered out. Experimental results demonstrate that, compared with other techniques, the rules produced by our induction algorithm have high consistency and simplicity.
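The sequential-covering scheme mentioned in the abstract can be sketched as follows: learn one rule at a time, remove the examples the rule covers, and repeat until no positive examples remain. The snippet below is a minimal illustration, not the paper's algorithm; the toy dataset, the function names, and the simple precision-based selector heuristic (standing in for the entropy-gain and beam-search heuristics named above) are all our own assumptions.

```python
def best_selector(examples, labels, attributes, target):
    """Pick the (attribute, value) test with the highest precision on the
    target class, breaking ties by the number of target examples covered.
    This is a deliberately simple greedy heuristic for illustration only."""
    best, best_score = None, (0.0, 0)
    for a in attributes:
        for v in sorted({ex[a] for ex in examples}):
            covered = [y for ex, y in zip(examples, labels) if ex[a] == v]
            hits = sum(y == target for y in covered)
            score = (hits / len(covered), hits)
            if score > best_score:
                best, best_score = (a, v), score
    # Signal failure if no selector covers any target example.
    return best if best_score[1] > 0 else None

def sequential_covering(examples, labels, attributes, target):
    """Learn single-selector if-then rules one at a time, removing the
    examples each new rule covers before learning the next rule."""
    rules = []
    examples, labels = list(examples), list(labels)
    while any(y == target for y in labels):
        sel = best_selector(examples, labels, attributes, target)
        if sel is None:
            break
        rules.append(sel)
        a, v = sel
        keep = [i for i, ex in enumerate(examples) if ex[a] != v]
        examples = [examples[i] for i in keep]
        labels = [labels[i] for i in keep]
    return rules

# Toy usage: two positive red objects, one negative blue object.
examples = [{'color': 'red', 'size': 'big'},
            {'color': 'red', 'size': 'small'},
            {'color': 'blue', 'size': 'big'}]
labels = ['pos', 'pos', 'neg']
rules = sequential_covering(examples, labels, ['color', 'size'], 'pos')
# One rule suffices here: if color = red then pos.
```

Real learners such as CN2 grow conjunctive rules selector by selector and use statistical significance tests and beam search; this sketch only shows the outer covering loop that they share.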