The most important factor governing the performance of a radial basis function network (RBFN) is the optimization of the network architecture, i.e. determining the number of radial basis functions (RBFs) in the hidden layer that best minimizes the error between the actual and network outputs. This work presents a genetic algorithm (GA) based evolution of the optimal RBFN architecture and compares its performance with the conventional RBFN training procedure, which employs a two-stage methodology: the k-means clustering algorithm for unsupervised training in the first stage, followed by linear supervised techniques for error minimization in the second stage. The proposed methodology is validated on the prediction of flank wear in the drilling process, using a series of experiments in which high-speed steel (HSS) drills cut holes in mild-steel workpieces. The genetically grown RBFN not only improves network performance but is also computationally efficient, as it eliminates the need for the error-minimization routine in the second stage of RBFN training.
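The conventional baseline described above trains an RBFN in two stages: k-means clustering fixes the hidden-layer centers (unsupervised), and a linear supervised step, typically least squares, fits the output weights. The sketch below is a minimal illustration of that pipeline under stated assumptions (Gaussian basis functions, one shared width sigma, NumPy throughout); it is not the authors' implementation, and all names and parameter values are illustrative.

```python
import numpy as np

def pairwise_sq_dists(X, C):
    """Squared Euclidean distances between rows of X and rows of C."""
    return ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm; returns the k cluster centers."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        labels = pairwise_sq_dists(X, C).argmin(axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(axis=0)
    return C

def rbf_design_matrix(X, C, sigma):
    """Gaussian RBF activations, one column per hidden unit."""
    return np.exp(-pairwise_sq_dists(X, C) / (2.0 * sigma ** 2))

def train_rbfn_two_stage(X, y, k, sigma=1.0):
    """Stage 1: k-means picks the centers. Stage 2: least squares fits
    the output weights (the error-minimization routine the GA avoids)."""
    C = kmeans(X, k)
    Phi = rbf_design_matrix(X, C, sigma)
    W, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return C, W

def predict(X, C, W, sigma=1.0):
    return rbf_design_matrix(X, C, sigma) @ W
```

The GA alternative evolves the network rather than training it in two stages. Since the abstract states that the genetically grown RBFN dispenses with the second-stage error-minimization routine, the chromosome presumably encodes the output weights as well as the basis functions. The toy real-coded GA below, reusing numpy and rbf_design_matrix from the previous sketch, evolves centers and weights jointly for a fixed number of hidden units; the paper's actual encoding, operators, and fitness function may differ, and extending the chromosome with an on/off bit per unit would additionally let the GA evolve the number of RBFs, as the paper does.

```python
def ga_train_rbfn(X, y, k, sigma=1.0, pop=30, gens=200,
                  sigma_mut=0.1, seed=0):
    """Toy GA that evolves RBF centers and output weights jointly,
    so no separate linear error-minimization stage is needed.
    Illustrative assumptions only, not the authors' algorithm."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]

    def decode(g):
        return g[:k * d].reshape(k, d), g[k * d:]

    def fitness(g):
        C, W = decode(g)
        pred = rbf_design_matrix(X, C, sigma) @ W
        return -np.mean((pred - y) ** 2)   # negative training MSE

    # initialize centers near random training points, weights near zero
    P = np.concatenate(
        [X[rng.choice(len(X), size=(pop, k))].reshape(pop, k * d),
         rng.normal(0.0, 0.1, size=(pop, k))], axis=1)
    for _ in range(gens):
        f = np.array([fitness(g) for g in P])
        # binary tournament selection
        winners = [max(rng.choice(pop, 2), key=lambda i: f[i])
                   for _ in range(pop)]
        P = P[winners]
        # arithmetic crossover between consecutive pairs
        for i in range(0, pop - 1, 2):
            a = rng.random()
            P[i], P[i + 1] = (a * P[i] + (1 - a) * P[i + 1],
                              a * P[i + 1] + (1 - a) * P[i])
        # Gaussian mutation on every gene
        P += rng.normal(0.0, sigma_mut, size=P.shape)
    f = np.array([fitness(g) for g in P])
    return decode(P[f.argmax()])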