Single-hidden-layer feedforward networks (SLFNs) with radial basis function (RBF) hidden nodes are universal approximators when all the parameters of the network are allowed to be adjusted. However, the learning speed of SLFNs is in general far slower than required, and this has been a major bottleneck in their applications for the past decades. Huang et al. proposed a new learning algorithm for SLFNs, called the extreme learning machine (ELM), which randomly chooses the hidden nodes and analytically determines the output weights. In this paper, common choices of RBF for generating ELM are analyzed and compared. The purpose of this study is to explore the comparative strengths and weaknesses of these choices and to provide useful guidelines on how to choose appropriate RBF hidden nodes for a particular problem.
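The two-step procedure described above (random hidden nodes, analytically determined output weights) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the choice of a Gaussian RBF, and the ranges used to draw the random centers and widths are all assumptions made here for concreteness.

```python
import numpy as np

def elm_rbf_train(X, T, n_hidden=50, rng=None):
    """Train an ELM with Gaussian RBF hidden nodes.

    Centers and widths are chosen randomly (an assumed, simple scheme);
    output weights are then determined analytically via the
    Moore-Penrose pseudoinverse of the hidden-layer output matrix.
    """
    rng = np.random.default_rng(rng)
    # Random RBF centers drawn from the range of the inputs (assumption).
    centers = rng.uniform(X.min(), X.max(), size=(n_hidden, X.shape[1]))
    # Random RBF widths in an illustrative range (assumption).
    widths = rng.uniform(0.1, 1.0, size=n_hidden)
    # Hidden-layer output matrix H: one Gaussian response per node.
    H = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / widths**2)
    # Analytical solution for the output weights: beta = pinv(H) @ T.
    beta = np.linalg.pinv(H) @ T
    return centers, widths, beta

def elm_rbf_predict(X, centers, widths, beta):
    """Evaluate the trained network on new inputs."""
    H = np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / widths**2)
    return H @ beta
```

Because the hidden-node parameters are fixed at random, training reduces to a single linear least-squares solve, which is what gives ELM its speed advantage over gradient-based tuning of all parameters.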