RBFNs (Radial-Basis Function Networks) are an attractive alternative to other neural network models. Their learning is usually split into an unsupervised part, in which the centers and widths of the basis functions are set, and a linear supervised part for weight computation. Although the literature on RBFN learning widely covers how basis function centers and weights should be set, little effort has been devoted to the learning of basis function widths. This paper addresses that topic: it shows the importance of a proper choice of basis function widths, and how inadequate values can dramatically degrade the approximation performance of the RBFN. It also proposes a one-dimensional search procedure as a compromise between an exhaustive search over all basis function widths and a non-optimal a priori choice.
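The two-stage scheme described above can be sketched in a few lines. The paper's exact search procedure is not reproduced here; this is a minimal sketch assuming Gaussian basis functions, a single common width obtained by scaling a reference distance between centers with a factor tuned by one-dimensional search on a validation set, and least-squares computation of the output weights. All function names and the choice of reference distance are illustrative.

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian basis functions sharing one common width.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_weights(Phi, y):
    # Supervised linear part: output weights by least squares.
    return np.linalg.lstsq(Phi, y, rcond=None)[0]

def search_width(X_tr, y_tr, X_val, y_val, centers, factors):
    # One-dimensional search: scale a reference width (here, the mean
    # pairwise distance between centers -- an illustrative choice) by
    # each candidate factor and keep the lowest validation error.
    ref = np.mean(np.linalg.norm(centers[:, None] - centers[None, :], axis=-1))
    best_width, best_err = None, np.inf
    for f in factors:
        w = f * ref
        W = fit_weights(rbf_design(X_tr, centers, w), y_tr)
        err = np.mean((rbf_design(X_val, centers, w) @ W - y_val) ** 2)
        if err < best_err:
            best_width, best_err = w, err
    return best_width
```

In practice the centers would first be placed by an unsupervised method such as k-means or vector quantization; only the width search and the linear weight fit are shown here.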