A statistical approach to rule learning
ICML '06 Proceedings of the 23rd international conference on Machine learning
While there is ample empirical evidence that traditional rule learning approaches work well in practice, it is nearly impossible to derive analytical results about their predictive accuracy. In this paper, we investigate rule learning from a theoretical perspective. We show that applying McAllester's PAC-Bayesian bound to rule learning yields a practical learning algorithm based on ensembles of weighted rule sets. Experiments with the resulting algorithm show not only that it is competitive with state-of-the-art rule learners, but also that its error rate can often be bounded tightly. In fact, the bound turns out to be tighter than one of the best bounds known so far for a practical learning scheme (the Set Covering Machine). Finally, we prove that the bound can be further improved by allowing the learner to abstain from uncertain predictions.
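The central quantity here is McAllester's PAC-Bayesian bound, which upper-bounds the true (Gibbs) risk of a posterior over hypotheses by its empirical risk plus a complexity term involving the KL divergence from a prior. The sketch below is illustrative only: the function name is hypothetical, and the exact constant inside the logarithm varies between statements of the bound, so the paper's precise variant may differ.

```python
import math

def pac_bayes_bound(emp_risk, kl, m, delta):
    """Illustrative sketch of a McAllester-style PAC-Bayesian bound.

    With probability at least 1 - delta over a sample of size m, the
    true Gibbs risk of the posterior Q is at most the empirical Gibbs
    risk plus sqrt((KL(Q||P) + ln(2*sqrt(m)/delta)) / (2m)).
    (Exact constants differ across versions of the bound.)
    """
    slack = math.sqrt((kl + math.log(2.0 * math.sqrt(m) / delta)) / (2.0 * m))
    return emp_risk + slack
```

Note how the bound tightens as the sample size m grows and loosens as the posterior moves away from the prior (larger KL), which is why weighting an ensemble of rule sets, rather than committing to a single complex rule set, can keep the complexity term small.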