A radial basis function (RBF) neural network is built from a number of RBF neurons, and such networks are among the most widely used neural networks for modeling various nonlinear problems in engineering. The conventional RBF neuron is usually based on a Gaussian activation function with a single width per neuron, which limits its ability to model complex nonlinear problems. To overcome the limitation of a single scale, this paper presents a neural network with a similar but distinct activation function, the hyper basis function (HBF). The HBF allows each input dimension to be scaled differently, which provides better generalization when dealing with complex nonlinear problems in engineering practice. The HBF generalizes the Gaussian neuron by applying a Mahalanobis-like distance as the metric between an input training sample and the prototype vector. Compared with the RBF, the HBF neuron has more parameters to optimize, but an HBF network needs fewer neurons to capture the relationship between the input and output sets while still achieving good generalization. However, recent results on HBF network performance indicate that a principled way of constructing this type of network is still needed; this paper addresses the issue by modifying a sequential learning algorithm for HBF networks that exploits the concept of a neuron's significance and allows HBF neurons to be grown and pruned during learning. An extensive experimental study shows that an HBF network trained with the developed learning algorithm achieves lower prediction error with a more compact network.
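To make the distinction concrete, the short sketch below contrasts a conventional Gaussian RBF neuron (one shared width) with an HBF neuron that scales each input dimension separately, i.e., a Mahalanobis-like distance with a diagonal metric. This is a minimal NumPy illustration assuming the diagonal case; the function names and the significance heuristic are assumptions made for this summary, not the paper's exact formulation.

import numpy as np

def rbf_activation(x, center, width):
    # Conventional Gaussian RBF neuron: one scalar width shared by all input dimensions.
    diff = x - center
    return np.exp(-np.dot(diff, diff) / (2.0 * width ** 2))

def hbf_activation(x, center, widths):
    # HBF neuron (diagonal case): each input dimension has its own width, so the
    # exponent is a Mahalanobis-like distance between the sample and the prototype.
    scaled = (x - center) / widths
    return np.exp(-0.5 * np.dot(scaled, scaled))

def neuron_significance(weight, center, widths, recent_samples):
    # Illustrative "significance" estimate: the neuron's average absolute contribution
    # to the network output over a window of recent samples. A growing-and-pruning
    # learner would add a neuron when a new sample is poorly covered and prune a
    # neuron whose significance falls below a chosen threshold.
    activations = np.array([hbf_activation(x, center, widths) for x in recent_samples])
    return abs(weight) * activations.mean()

# Example: the same offset along two axes contributes differently to the HBF
# activation because each axis carries its own width.
x = np.array([0.4, 0.4])
center = np.zeros(2)
print(rbf_activation(x, center, width=1.0))
print(hbf_activation(x, center, widths=np.array([0.2, 2.0])))

In the general (non-diagonal) case the per-dimension widths are replaced by a positive-definite scaling matrix M, giving exp(-(x - c)^T M (x - c)), which reduces to the Gaussian RBF when M is a scaled identity.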