Generalizing the structure of explanations in explanation-based learning
Explanation-based generalization algorithms need to generalize the structure of their explanations. This is necessary in order to acquire concepts in which a recursive or iterative process is implicitly represented in the explanation by a fixed number of rule applications. The fully implemented BAGGER2 system generalizes explanation structures and produces recursive concepts when warranted; otherwise it produces the same result as standard explanation-based generalization algorithms. BAGGER2's generalization algorithm is presented, and empirical results demonstrating the value of acquiring recursive concepts are reported. These experimental results indicate that generalizing explanation structures helps avoid the recently reported negative effects of learning. The advantages of the new approach over previous approaches to generalizing explanation structures are also described.
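The core idea above can be illustrated with a small sketch. This is not the actual BAGGER2 algorithm; it is a hypothetical toy generalizer that, given an explanation represented as an ordered list of rule applications, detects a run of repeated applications of the same rule and emits a recursive concept covering any number of applications, rather than a fixed-depth macro as standard explanation-based generalization would. All names here are illustrative assumptions.

```python
# Toy sketch (hypothetical, not BAGGER2 itself): if an explanation
# applies one rule several times in a row, generalize that run into
# a recursive concept; otherwise fall back to a flat macro, as a
# standard explanation-based generalizer would.

def generalize_explanation(steps):
    """steps: ordered list of rule names forming the explanation.

    Returns a recursive concept when one rule accounts for a run of
    two or more consecutive applications; otherwise a flat macro.
    """
    # Find the longest run of consecutive identical rule applications.
    best_rule, best_len, i = None, 0, 0
    while i < len(steps):
        j = i
        while j < len(steps) and steps[j] == steps[i]:
            j += 1
        if j - i > best_len:
            best_rule, best_len = steps[i], j - i
        i = j

    if best_len >= 2:
        # The fixed number of applications is replaced by a recursive
        # step that can apply the rule any number of times.
        return {"type": "recursive",
                "base": [s for s in steps if s != best_rule],
                "step": best_rule}
    return {"type": "macro", "steps": list(steps)}

# An explanation that happens to unstack three blocks one at a time
# generalizes to a concept that unstacks any number of blocks:
concept = generalize_explanation(["clear", "unstack", "unstack", "unstack"])
```

In this sketch the three literal `unstack` applications collapse into a single recursive step, which is the structural generalization the abstract refers to; an explanation with no repeated rule would come back unchanged as a macro.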