This paper presents an approach to a new learning problem: learning from an approximate theory and a set of noisy examples. This problem requires a new learning approach because it cannot be satisfactorily solved by inductive or analytic learning algorithms, or by their existing combinations. Our approach can be viewed as an extension of the minimum description length (MDL) principle, and is unique in that it encodes the refinement required to transform the given theory into a better theory, rather than encoding the resultant theory as in traditional MDL. Experimental results show that, with our approach, the theory learned from an approximate theory and a set of noisy examples is more accurate than either the approximate theory itself or a theory learned from the examples alone. This suggests that our approach can combine useful information from both the theory and the training set, even though each is only partially correct.
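The contrast between the two encodings can be sketched as follows. This is a minimal illustration, not the paper's actual coding scheme: the function names, the uniform-code cost model, and the way exceptions are charged are all simplifying assumptions introduced here. The key difference it captures is which object is encoded — the whole resultant theory (traditional MDL) versus only the refinement applied to the given approximate theory.

```python
import math

def description_length_bits(num_choices: int) -> float:
    # Bits needed to specify one option out of num_choices equally
    # likely alternatives (a uniform code).
    return math.log2(num_choices)

def exceptions_cost(num_examples: int, num_misclassified: int) -> float:
    # Bits needed to identify which training examples the theory gets
    # wrong: log2 of the number of subsets of that size.
    return description_length_bits(math.comb(num_examples, num_misclassified))

def traditional_mdl(theory_size_bits: float, num_examples: int,
                    num_misclassified: int) -> float:
    # Traditional MDL score: encode the entire resultant theory,
    # plus the exceptions it leaves on the training data.
    return theory_size_bits + exceptions_cost(num_examples, num_misclassified)

def refinement_mdl(refinement_size_bits: float, num_examples: int,
                   num_misclassified: int) -> float:
    # Refinement-based score: encode only the edit that transforms the
    # given approximate theory into the revised theory, plus the
    # revised theory's exceptions.
    return refinement_size_bits + exceptions_cost(num_examples, num_misclassified)
```

Under this toy cost model, a small revision to a large but roughly correct theory is cheap to describe (a few bits for the edit) even when re-encoding the full revised theory from scratch would be expensive, which is why the refinement encoding can exploit an approximate theory that is only partially correct.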