A reduction technique for nearest-neighbor classification: Small groups of examples
Intelligent Data Analysis
An important research issue in RBF networks is how to determine the Gaussian centers of the radial-basis functions. We investigate a technique that identifies these centers with carefully selected training examples, with the objective of minimizing the network's size. The essence is to select three very small subsets rather than a single larger subset, whose size would exceed the combined size of the three. The subsets complement each other in the sense that, when used by a nearest-neighbor classifier, each incurs errors in a different part of the instance space. The paper describes the example-selection algorithm and demonstrates its merits in the design of RBF networks experimentally.
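The abstract does not reproduce the selection algorithm itself, but the overall design it describes — selected training examples becoming the Gaussian centers of an RBF network — can be sketched. The following is a minimal illustration in Python with NumPy, not the paper's method: the selection step is stood in for by hand-picking one example per region, and all names (`rbf_design_matrix`, `train_output_weights`, the width value) are hypothetical.

```python
import numpy as np

def rbf_design_matrix(X, centers, width=1.0):
    """Gaussian activation of every sample with respect to every center."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * width ** 2))

def train_output_weights(X, y, centers, width=1.0):
    """Least-squares fit of the linear output layer on the RBF activations."""
    Phi = rbf_design_matrix(X, centers, width)
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return weights

def predict(X, centers, weights, width=1.0):
    return rbf_design_matrix(X, centers, width) @ weights

# Toy data: two well-separated clusters, one class each.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.],
              [5., 5.], [5., 6.], [6., 5.], [6., 6.]])
y = np.array([0., 0., 0., 0., 1., 1., 1., 1.])

# Stand-in for the paper's example-selection step: one chosen
# training example per region of the instance space.
centers = X[[0, 4]]
w = train_output_weights(X, y, centers)
labels = (predict(X, centers, w) > 0.5).astype(int)
```

The point of the sketch is the size argument made in the abstract: the fewer examples the selection step retains, the fewer Gaussian units the resulting network has, since each retained example contributes exactly one center.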