A novel structure for radial basis function (RBF) networks is proposed. Unlike the traditional RBF network, this structure places weights between the input and hidden layers. These weights, which take values around unity, act as multiplicative factors on the input vector and perform a linear mapping. This increases the number of free parameters of the network, but since the weights are trainable, the overall performance improves significantly. Because of this new weight vector, we call the structure the Weighted RBF, or WRBF. A weight-adjustment formula is derived by applying the gradient-descent algorithm. Two classification problems were used to evaluate the new network: letter classification on the UCI dataset with 16 features, a difficult problem, and digit recognition on the HODA dataset with 64 features, an easy problem. WRBF is compared with the classic RBF network and the MLP, and our experiments show that WRBF significantly outperforms both. For example, with 200 hidden neurons, WRBF achieved a recognition rate of 92.78% on the UCI dataset, while RBF and MLP achieved 83.13% and 89.25%, respectively. On the HODA dataset, WRBF reached a 97.94% recognition rate, whereas RBF achieved 97.14% and MLP achieved 97.63%.
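The idea described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes Gaussian basis functions with a shared width, and all names (`wrbf_forward`, `input_weights`, etc.) are hypothetical. The key difference from a classic RBF is the element-wise scaling of the input by trainable weights initialised near unity.

```python
import numpy as np

def wrbf_forward(x, input_weights, centers, sigma, out_weights):
    """Forward pass of a Weighted RBF (WRBF) network (illustrative sketch).

    Unlike a classic RBF network, the input vector is first scaled
    element-wise by trainable weights (initialised around 1.0) before
    the distance to each hidden-layer centre is computed.
    """
    z = input_weights * x                      # linear mapping of the input
    d2 = np.sum((centers - z) ** 2, axis=1)    # squared distance to each centre
    phi = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian hidden activations
    return phi @ out_weights                   # linear output layer

# Toy example: 4 input features, 3 hidden neurons, 2 output classes.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
w = np.ones(4)                 # unit weights -> reduces to a classic RBF
c = rng.normal(size=(3, 4))    # hidden-layer centres
v = rng.normal(size=(3, 2))    # hidden-to-output weights
y = wrbf_forward(x, w, c, 1.0, v)
```

With `input_weights` fixed at one, the forward pass is exactly that of a classic RBF network; training these weights by gradient descent alongside the other parameters is what distinguishes the WRBF.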