A simulation framework for knowledge acquisition evaluation
ACSC '05 Proceedings of the Twenty-eighth Australasian conference on Computer Science - Volume 38
Evaluating knowledge acquisition (KA) is difficult in general. In recent years, incremental knowledge acquisition, which emphasises direct communication between human experts and systems, has seen increasingly wide use. However, evaluating incremental KA techniques, like KA techniques in general, is difficult because of the cost of involving human experts in experimental studies. In this paper, we use a general simulation framework to evaluate Ripple Down Rules (RDR), a successful incremental KA method. We focus on two fundamental aspects of incremental KA: the importance of acquiring domain ontological structures and the use of cornerstone cases.
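To make the cornerstone-case mechanism concrete, the following is a minimal sketch of single-classification Ripple Down Rules. It is an illustrative implementation, not the paper's own code: cases are assumed to be dictionaries of attribute values, conditions are predicates over a case, and all names (`Rule`, `classify`, the "fever" example) are hypothetical.

```python
# Minimal sketch of single-classification RDR (illustrative, not from the paper).
# Each rule stores the "cornerstone" case that prompted its creation; when a
# conclusion is wrong, an expert adds an exception rule whose condition
# distinguishes the new case from that cornerstone.

class Rule:
    def __init__(self, condition, conclusion, cornerstone):
        self.condition = condition      # predicate: case -> bool
        self.conclusion = conclusion    # label given when the rule fires
        self.cornerstone = cornerstone  # case seen when this rule was added
        self.except_child = None        # tried next when this rule fires
        self.else_child = None          # tried next when this rule does not fire

def classify(rule, case, default=None):
    """Descend the rule tree; the last satisfied rule's conclusion wins."""
    conclusion = default
    while rule is not None:
        if rule.condition(case):
            conclusion = rule.conclusion
            rule = rule.except_child
        else:
            rule = rule.else_child
    return conclusion

# Root rule: everything is "benign" unless an exception says otherwise.
root = Rule(lambda c: True, "benign", cornerstone={})

# The expert sees a misclassified case (temp 39) and adds an exception rule;
# its condition separates the new case from the firing rule's cornerstone.
root.except_child = Rule(lambda c: c.get("temp", 0) > 38,
                         "fever", cornerstone={"temp": 39})

print(classify(root, {"temp": 39}))  # -> fever
print(classify(root, {"temp": 37}))  # -> benign
```

Because a new exception rule is validated only against the local cornerstone case, knowledge can be added incrementally without re-checking the whole rule base, which is the property the simulation framework in this paper is designed to evaluate.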