This paper proposes a very fast one-pass throw-away learning algorithm based on a hyperellipsoidal function that can be translated and rotated to cover the data set during the learning process. The translation and rotation of the hyperellipsoidal function depend on the distribution of the data set. In addition, we present a versatile elliptic basis function (VEBF) neural network with one hidden layer. The hidden layer is adaptively divided into sub-hidden layers according to the number of classes in the training data set. Each sub-hidden layer can grow by adding a new node to learn new samples during the training process. The learning time is O(n), where n is the number of data points. The network can learn any new incoming datum independently, without involving the previously learned data, so there is no need to store all the data and mix them with new incoming data during the learning process.
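The one-pass scheme above can be sketched as follows. This is a minimal illustration, not the paper's actual VEBF update rules: it assumes each hidden node is a hyperellipsoid summarised by a running mean and covariance (the covariance's eigenvectors play the role of the rotation), each class owns its own sub-hidden layer of nodes, and a new sample either updates the node that covers it or spawns a new node. All class and function names here (`VEBFNode`, `VEBFNetwork`, `learn_one`) are hypothetical.

```python
import numpy as np

class VEBFNode:
    """One hidden node: a hyperellipsoid summarised by a running mean and
    covariance (a simplification of the paper's translate-and-rotate update)."""
    def __init__(self, x, init_width=1.0):
        self.n = 1
        self.mean = np.asarray(x, dtype=float)
        self.cov = np.eye(len(x)) * init_width ** 2

    def covers(self, x):
        # Mahalanobis distance <= 1 means x lies inside the ellipsoid.
        d = x - self.mean
        return d @ np.linalg.solve(self.cov, d) <= 1.0

    def update(self, x):
        # Incremental mean/covariance update: no stored samples are needed,
        # which is what makes the learning one-pass and O(n) overall.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.cov += (np.outer(delta, x - self.mean) - self.cov) / self.n

class VEBFNetwork:
    def __init__(self):
        self.subs = {}  # class label -> list of nodes (one sub-hidden layer per class)

    def learn_one(self, x, label):
        x = np.asarray(x, dtype=float)
        nodes = self.subs.setdefault(label, [])
        for node in nodes:
            if node.covers(x):       # sample falls inside an existing ellipsoid
                node.update(x)       # translate/reshape that ellipsoid in place
                return
        nodes.append(VEBFNode(x))    # otherwise grow the sub-layer with a new node

    def predict(self, x):
        # Classify by the node whose ellipsoid centre is nearest in
        # Mahalanobis distance.
        x = np.asarray(x, dtype=float)
        best, best_d = None, np.inf
        for label, nodes in self.subs.items():
            for node in nodes:
                d = x - node.mean
                dist = d @ np.linalg.solve(node.cov, d)
                if dist < best_d:
                    best, best_d = label, dist
        return best
```

Because each sample is compared only against the current nodes of its class and then discarded, the pass over n samples touches each datum once, matching the O(n) learning time and the claim that previously learned data need not be stored.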