The previously described KBANN system integrates existing knowledge into neural networks by defining the network topology and setting initial link weights. Standard neural learning techniques can then be used to train such networks, thereby refining the knowledge upon which the network is based. However, standard neural learning techniques have difficulty training networks with multiple layers of hidden units, and KBANN commonly creates such networks. In addition, these techniques ignore some of the information contained in the networks KBANN creates. This paper describes a symbolic inductive learning algorithm for training such networks that exploits this previously ignored information and helps to address the difficulty of training "deep" networks. Empirical evidence shows that this method improves not only learning speed but also the networks' ability to generalize correctly to testing examples.
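To make the "topology and initial link weights" step concrete, the following is a minimal sketch of how a KBANN-style translation can turn a propositional rule into a sigmoid unit. The weight magnitude `W`, the bias formula, and the helper names are illustrative assumptions, not the paper's exact scheme:

```python
# Hypothetical sketch of KBANN-style rule-to-network translation.
# The weight magnitude W and the bias formula are assumptions chosen so
# that the unit approximates the rule's logical behavior.
import math

W = 4.0  # link-weight magnitude assigned to each antecedent of a rule


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def unit_for_rule(positive, negated, disjunctive=False):
    """Return (weights, bias) for a sigmoid unit encoding one rule.

    positive / negated: names of unnegated and negated antecedents.
    A conjunctive unit should fire only when every antecedent is
    satisfied; a disjunctive unit fires when any positive antecedent is.
    """
    weights = {a: W for a in positive}
    weights.update({a: -W for a in negated})
    p = len(positive)
    # Bias places the zero crossing of the net input at the rule's
    # logical threshold: all antecedents satisfied (AND) vs. any (OR).
    bias = -0.5 * W if disjunctive else -(p - 0.5) * W
    return weights, bias


def activate(weights, bias, truth):
    """Activation of the unit given 0/1 truth values for antecedents."""
    net = bias + sum(w * truth.get(a, 0.0) for a, w in weights.items())
    return sigmoid(net)


# Example rule: "promoter :- contact, conformation."
w, b = unit_for_rule(positive=["contact", "conformation"], negated=[])
print(activate(w, b, {"contact": 1, "conformation": 1}))  # near 1
print(activate(w, b, {"contact": 1, "conformation": 0}))  # near 0
```

Because the resulting weights are near a logical decision boundary rather than random, subsequent training refines the encoded rules instead of learning from scratch; chaining rules whose consequents feed other rules is what produces the multi-layer "deep" networks the abstract mentions.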