Lazy training of radial basis neural networks
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
Lazy learning methods address problems in which the training examples are not evenly distributed over the input space. They defer computation until a new query is received, and then select a subset of the training patterns. Usually that selection is static, based on the k nearest neighbors: the number of patterns selected does not depend on the region of the input space in which the query falls. In this paper, a lazy strategy is applied to train radial basis function neural networks. The strategy selects patterns dynamically, weighting them with one of two kernel functions, the Gaussian and the inverse function, so that the number of selected patterns adapts to the region around the query. This lazy learning method is compared with classical lazy machine learning methods and with eagerly trained radial basis function neural networks.
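The contrast between static k-nearest-neighbor selection and kernel-based dynamic selection can be illustrated with a minimal sketch. The function names, the threshold parameter, and the kernel widths below are illustrative assumptions, not the paper's actual formulation; the point is only that with a kernel weight and a cutoff, the number of selected patterns varies with the local density around the query, whereas k-NN always returns exactly k patterns.

```python
import math

def gaussian_kernel(d, width=1.0):
    # Gaussian weight: close to 1 near the query, decays with distance.
    return math.exp(-(d * d) / (2.0 * width * width))

def inverse_kernel(d, eps=1e-8):
    # Inverse-distance weight; eps avoids division by zero at d == 0.
    return 1.0 / (d + eps)

def select_patterns_dynamic(X, y, query, kernel, threshold):
    """Dynamic selection (illustrative): keep every training pattern whose
    kernel weight relative to the query exceeds a threshold. The number
    of patterns kept depends on how densely the input space is populated
    around the query, unlike static k-NN selection."""
    selected = []
    for x, t in zip(X, y):
        w = kernel(math.dist(x, query))
        if w >= threshold:
            selected.append((x, t, w))  # weight can also serve as pattern importance
    return selected

if __name__ == "__main__":
    # Toy 1-D data: a dense cluster near 0 and one isolated pattern at 5.
    X = [(0.0,), (0.1,), (0.2,), (5.0,)]
    y = [0.0, 0.0, 0.0, 1.0]
    k = lambda d: gaussian_kernel(d, width=1.0)
    dense = select_patterns_dynamic(X, y, (0.1,), k, threshold=0.5)
    sparse = select_patterns_dynamic(X, y, (5.0,), k, threshold=0.5)
    print(len(dense), len(sparse))  # more patterns selected in the dense region
```

The selected patterns (and, optionally, their weights) would then feed the training of a local radial basis function network for that single query, which is the lazy step the abstract describes.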