This paper presents a versatile hyper-ellipsoidal basis function for function approximation in high-dimensional spaces. The hyper-ellipsoidal basis function can be translated and rotated to cover the data according to its distribution in the given high-dimensional space. Based on this function, we propose a one-pass hyper-ellipsoidal learning algorithm in which each new incoming datum can be learned without revisiting previously learned data. This learning algorithm adjusts the parameters of the versatile hyper-ellipsoidal basis function. In addition, we propose the hyper-ellipsoidal basis function (HEBF) neural network, which uses the one-pass hyper-ellipsoidal learning algorithm. Its structure is similar to that of radial basis function (RBF) neural networks. The number of hidden neurons in the HEBF neural network can increase or decrease during the learning process: it grows according to a geometric growth criterion and shrinks by merging two hidden neurons into a new hidden neuron according to a merging criterion. The merging step can be performed independently, without consulting the previously learned data set.
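To make the two key ideas concrete, the sketch below shows (a) a hyper-ellipsoidal basis unit, where a full inverse-covariance quadratic form (rather than the RBF's isotropic distance) lets the unit translate and rotate to fit the data, and (b) a merge of two hidden units into one. The paper's exact parameter-update and merging criteria are not reproduced here; the merge shown is a generic moment-matching combination of two units' centers and covariances, used purely for illustration, and all function names and the 2-D setting are assumptions of this sketch.

```python
import math

def inv2(m):
    # Inverse of a 2x2 covariance matrix (2-D case, for illustration only).
    a, b = m[0]
    c, d = m[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def hebf(x, center, cov):
    # Hyper-ellipsoidal basis unit: exp(-(x - c)^T cov^{-1} (x - c)).
    # Off-diagonal covariance terms rotate the ellipsoid; unequal
    # diagonal terms stretch it along its axes.
    s = inv2(cov)
    d = [xi - ci for xi, ci in zip(x, center)]
    q = sum(d[i] * s[i][j] * d[j] for i in range(2) for j in range(2))
    return math.exp(-q)

def merge_units(c1, S1, n1, c2, S2, n2):
    # Moment-matching merge of two hidden units (centers c, covariances S,
    # support counts n) into a single unit. This needs only the units'
    # own parameters, not the learned data set. Illustrative stand-in
    # for the paper's merging criterion.
    n = n1 + n2
    c = [(n1 * a + n2 * b) / n for a, b in zip(c1, c2)]
    S = [[0.0, 0.0], [0.0, 0.0]]
    for ci, Si, ni in ((c1, S1, n1), (c2, S2, n2)):
        d = [ci[k] - c[k] for k in range(2)]
        for i in range(2):
            for j in range(2):
                S[i][j] += ni * (Si[i][j] + d[i] * d[j]) / n
    return c, S, n
```

A unit evaluates to 1 at its own center and decays anisotropically away from it; merging two identical-covariance units yields a single unit whose ellipsoid covers both original centers.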