It is often difficult to predict the optimal neural network size for a particular application. Constructive or destructive methods that add or subtract neurons, layers, connections, etc. might offer a solution to this problem. We prove that one such method, recurrent cascade correlation, has fundamental representational limitations, and hence learning limitations, due to its topology: with monotone (i.e., sigmoid) and hard-threshold activation functions it cannot represent certain finite state automata. We give a preliminary approach to circumventing these limitations by devising a simple constructive training method that adds neurons during training while still preserving the powerful fully-recurrent structure. We illustrate this approach with simulations that learn many examples of regular grammars that the recurrent cascade correlation method is unable to learn.
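The abstract does not include code, so what follows is only a minimal sketch in Python of the idea it describes: a fully-recurrent network that adds hidden neurons during training. It assumes a tanh recurrent state, a sigmoid output read at the end of each sequence, plain backpropagation through time, and a simple error-plateau growth criterion. The names GrowingRNN and make_example, the parity task, and all hyperparameters are illustrative assumptions, not the authors' actual training procedure or growth test.

import numpy as np

rng = np.random.default_rng(0)

class GrowingRNN:
    """Fully-recurrent network that can grow hidden neurons during training (sketch)."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W = rng.normal(0, 0.1, (n_hidden, n_hidden))  # recurrent weights
        self.U = rng.normal(0, 0.1, (n_hidden, n_in))      # input-to-hidden weights
        self.V = rng.normal(0, 0.1, (n_out, n_hidden))     # hidden-to-output weights

    def add_neuron(self):
        # Constructive step (assumed form): enlarge every weight matrix by one
        # hidden unit, copying the already-trained weights into place and giving
        # the new unit near-zero weights so the current function is barely perturbed.
        n = self.W.shape[0]
        W = rng.normal(0, 0.01, (n + 1, n + 1)); W[:n, :n] = self.W
        U = rng.normal(0, 0.01, (n + 1, self.U.shape[1])); U[:n] = self.U
        V = rng.normal(0, 0.01, (self.V.shape[0], n + 1)); V[:, :n] = self.V
        self.W, self.U, self.V = W, U, V

def train_step(net, xs, target, lr=0.5):
    # Forward pass, storing hidden states for backpropagation through time.
    hs = [np.zeros(net.W.shape[0])]
    for x in xs:
        hs.append(np.tanh(net.W @ hs[-1] + net.U @ x))   # fully-recurrent update
    y = 1.0 / (1.0 + np.exp(-net.V @ hs[-1]))            # sigmoid readout at sequence end
    # Backward pass: cross-entropy gradient at the output, then back through time.
    dy = y - target
    dV = np.outer(dy, hs[-1])
    dh = net.V.T @ dy
    dW, dU = np.zeros_like(net.W), np.zeros_like(net.U)
    for t in range(len(xs), 0, -1):
        dz = dh * (1.0 - hs[t] ** 2)                     # through tanh
        dW += np.outer(dz, hs[t - 1])
        dU += np.outer(dz, xs[t - 1])
        dh = net.W.T @ dz
    net.V -= lr * dV; net.W -= lr * dW; net.U -= lr * dU
    return float(np.sum((y - target) ** 2))

def make_example(length):
    # Toy regular language (hypothetical stand-in for the paper's grammars):
    # classify binary strings by parity of their 1s, a two-state automaton;
    # each symbol is one-hot encoded.
    bits = rng.integers(0, 2, length)
    return np.eye(2)[bits], np.array([bits.sum() % 2], dtype=float)

data = [make_example(int(rng.integers(2, 8))) for _ in range(100)]

net = GrowingRNN(n_in=2, n_hidden=1, n_out=1)
prev_err = np.inf
for epoch in range(300):
    err = sum(train_step(net, xs, t) for xs, t in data)
    if prev_err - err < 1e-3 and net.W.shape[0] < 8:     # error plateau: grow
        net.add_neuron()
    prev_err = err
print(f"final error {err:.3f} with {net.W.shape[0]} hidden neurons")

Note the design choice in add_neuron: the trained weights are copied into the enlarged matrices and the new unit starts with near-zero weights, so growth barely perturbs the function learned so far while keeping the network fully recurrent, in contrast to the frozen, cascaded hidden units of recurrent cascade correlation.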