Radial basis function (RBF) networks are efficient general function approximators. They show good generalization performance and are easy to train. For theoretical reasons, RBF networks commonly use Gaussian activation functions, but it has been shown that this tight restriction on the choice of activation function can be relaxed in practical applications. As an alternative, differences of sigmoidal functions (SRBFs) have been proposed. SRBFs have an additional parameter that increases the ability of a network node to adapt its shape to input patterns, even in cases where Gaussian functions fail. In this paper we follow the idea of incorporating greater flexibility into radial basis functions. We propose to use splines as localized deformable radial basis functions (DRBFs). We present initial results showing that DRBFs can be evaluated more efficiently than SRBFs. We show that even with this enhanced flexibility the network remains easy to train and converges robustly towards smooth solutions.
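To make the baseline concrete, the following is a minimal sketch of a standard Gaussian RBF network of the kind the abstract starts from: a layer of radial activations followed by a linear output layer fitted by least squares. The function names, the fixed centers and widths, and the toy sine-approximation task are illustrative assumptions; the paper's contribution (replacing the Gaussian with a deformable spline) is only indicated by the pluggable `phi` argument and is not reproduced here.

```python
import numpy as np

def gaussian_rbf(r, width):
    """Standard Gaussian activation on the radial distance r."""
    return np.exp(-(r / width) ** 2)

def rbf_forward(x, centers, widths, weights, phi=gaussian_rbf):
    """Evaluate an RBF network: a weighted sum of radial activations.

    phi is pluggable: the paper's DRBF idea would substitute a localized
    deformable spline here in place of the fixed Gaussian (not shown).
    """
    # r[i, j] = Euclidean distance of sample i to center j
    r = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
    return phi(r, widths) @ weights

def fit_output_weights(x, y, centers, widths, phi=gaussian_rbf):
    """Fix centers/widths and solve the linear output layer by least squares."""
    r = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
    H = phi(r, widths)           # design matrix of basis activations
    weights, *_ = np.linalg.lstsq(H, y, rcond=None)
    return weights

# Toy usage: approximate sin(x) on [0, 2*pi] with 10 fixed Gaussian centers.
x = np.linspace(0.0, 2.0 * np.pi, 100)[:, None]
y = np.sin(x[:, 0])
centers = np.linspace(0.0, 2.0 * np.pi, 10)[:, None]
widths = np.full(10, 0.8)

w = fit_output_weights(x, y, centers, widths)
pred = rbf_forward(x, centers, widths, w)
print("max abs error:", np.max(np.abs(pred - y)))
```

Because the output layer is linear in the weights, training reduces to a least-squares solve once centers and widths are chosen, which is one reason RBF networks are considered easy to train.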