Self-organization of brain areas in animals begins prenatally, evidently driven by spontaneously generated internal activity patterns. The neural structures continue to develop postnatally as the sensory systems are exposed to stimuli from the environment. In this process, prenatal training may give the neural system an appropriate bias so that it can learn reliably under changing environmental stimuli. This paper evaluates the hypothesis that an artificial learning system can benefit from a similar approach, consisting of initial training with patterns from an evolved generator followed by training on the actual training set. Competitive learning networks were trained to recognize handwritten digits in three ways: through environmental learning only, through evolution only, and through prenatal training with evolved pattern generators followed by environmental learning. The results demonstrate that the evolved pattern generator approach leads to better learning performance, suggesting that complex systems can be constructed effectively in this way.
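The two-phase scheme described above can be illustrated with a minimal sketch. This is not the paper's implementation: the winner-take-all update is standard competitive learning, while `evolved_generator` is a hypothetical stand-in for the evolved pattern generator, here simply emitting noisy prototype-like patterns; all names and parameters are assumptions for illustration.

```python
import random

def competitive_learn(weights, patterns, lr, epochs):
    """Winner-take-all competitive learning: for each input,
    move only the closest weight vector toward the input."""
    for _ in range(epochs):
        for x in patterns:
            # winning unit = smallest squared Euclidean distance
            win = min(range(len(weights)),
                      key=lambda i: sum((wj - xj) ** 2
                                        for wj, xj in zip(weights[i], x)))
            # update only the winner
            weights[win] = [wj + lr * (xj - wj)
                            for wj, xj in zip(weights[win], x)]
    return weights

def evolved_generator(dim, rng, n=20):
    """Hypothetical stand-in for an evolved pattern generator:
    emits coarse, noisy variations of a prototype pattern."""
    center = [rng.random() for _ in range(dim)]
    return [[c + 0.05 * rng.gauss(0, 1) for c in center] for _ in range(n)]

rng = random.Random(0)
dim, units = 4, 3
weights = [[rng.random() for _ in range(dim)] for _ in range(units)]

# "prenatal" phase: train on generator output first
weights = competitive_learn(weights, evolved_generator(dim, rng), 0.2, 5)

# "environmental" phase: train on the actual data set
# (random vectors here, in place of handwritten-digit features)
data = [[rng.random() for _ in range(dim)] for _ in range(50)]
weights = competitive_learn(weights, data, 0.1, 10)
```

The intended effect of the prenatal phase is to bias the initial weight vectors toward plausible input regions, so that the later environmental phase starts from a useful configuration rather than from arbitrary random weights.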