Knowledge Acquisition from Both Human Expert and Data
PAKDD '01 Proceedings of the 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining
The knowledge acquisition method "Ripple Down Rules" (RDR) can directly acquire and encode knowledge from human experts. It is an incremental method: each new piece of knowledge is added as an exception to the existing knowledge base. Another type of knowledge acquisition learns directly from data; decision tree induction is one representative example. Since ever more data are stored in databases in this digital era, using both human expertise and stored data becomes increasingly important. Further, it is not appropriate to assume that knowledge is stable and remains useful; things change over time, and when they do, stale knowledge should not be kept in the knowledge base. This paper attempts to integrate inductive learning and knowledge acquisition in a setting where a stable environment cannot be assumed. We show that, using the minimum description length principle (MDLP), the Ripple Down Rules knowledge base can be constructed from data automatically and incrementally, so human expertise and data can be used simultaneously. When a change is detected, useless knowledge is automatically deleted based on MDLP while keeping the knowledge base consistent. Carefully designed experiments verify that the proposed method works for many data sets of differing natures.
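To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the authors' implementation) of (a) a Ripple Down Rules knowledge base, where each rule may carry an exception branch tried when the rule fires, and (b) a simple two-part MDL score that trades the cost of encoding rules against the cost of encoding the cases they misclassify. All names, the bit costs, and the tree layout are illustrative assumptions.

```python
import math
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class RDRNode:
    # One Ripple Down Rule: if `condition` holds, conclude `conclusion`,
    # unless the exception branch (`except_`) fires and refines the answer.
    condition: Callable[[dict], bool]
    conclusion: str
    except_: Optional["RDRNode"] = None   # tried when condition is True
    if_not: Optional["RDRNode"] = None    # tried when condition is False

    def classify(self, case: dict, default: str = "unknown") -> str:
        if self.condition(case):
            if self.except_ is not None:
                # The parent's conclusion is the fallback if no exception fires.
                return self.except_.classify(case, default=self.conclusion)
            return self.conclusion
        if self.if_not is not None:
            return self.if_not.classify(case, default=default)
        return default

def description_length(n_rules: int, errors: int, n_cases: int,
                       bits_per_rule: float = 8.0) -> float:
    # Two-part MDL code: model cost (bits to encode the rules) plus
    # data cost (bits to point out each misclassified case).
    model_cost = n_rules * bits_per_rule
    data_cost = errors * math.log2(max(n_cases, 2))
    return model_cost + data_cost
```

Under this scoring, a candidate exception rule is accepted only if the total description length with the extra rule (more model bits, fewer error bits) is lower than without it; the same comparison, run in reverse when the data drift, justifies deleting a rule that no longer pays for itself.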