The generalization ability of neural networks depends heavily on the quality of the training data: some training patterns may be redundant or irrelevant. It has been shown that careful dynamic selection of training patterns can yield better generalization performance. Nevertheless, such selection is usually carried out independently of the novel patterns to be approximated. In this paper, we present a learning method that automatically selects the training patterns most appropriate to the new sample to be predicted. The proposed method has been applied to Radial Basis Neural Networks, whose generalization capability is usually rather poor. The learning strategy slows down the response of the network in the generalization phase; however, this does not impose a significant limitation on the applicability of the method, because Radial Basis Neural Networks train quickly.
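As a rough illustration of the idea (not the authors' exact algorithm), a lazy, query-dependent selection scheme can be sketched as follows: given a new sample, select its k nearest training patterns, fit a small Gaussian RBF model on just those patterns, and use it to predict. The function name, the neighbourhood size `k`, the number of centers, and the kernel `width` below are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def lazy_rbf_predict(X_train, y_train, x_query, k=20, n_centers=5, width=0.5):
    """Hypothetical sketch: train a local RBF model only on the k training
    patterns nearest to the query, then predict at the query point."""
    # 1. Select the k training patterns closest to the new sample.
    d = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(d)[:k]
    Xs, ys = X_train[idx], y_train[idx]
    # 2. Spread a few RBF centers over the selected patterns.
    centers = Xs[:: max(1, k // n_centers)]
    # 3. Fit the output weights of the Gaussian basis by least squares.
    Phi = np.exp(-np.linalg.norm(Xs[:, None, :] - centers[None, :, :], axis=2) ** 2
                 / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, ys, rcond=None)
    # 4. Evaluate the local model at the query point.
    phi_q = np.exp(-np.linalg.norm(centers - x_query, axis=1) ** 2 / (2 * width ** 2))
    return phi_q @ w

# Toy usage: approximate sin(x) at x = 1.0 from noiseless samples.
X = np.linspace(0.0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
pred = lazy_rbf_predict(X, y, np.array([1.0]))
```

Because only a small neighbourhood is modelled per query, the fit is fast but must be repeated for every new sample, which is the source of the slowdown at prediction time mentioned above.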