As an extension of the traditional normalized radial basis function (NRBF) model, the extended normalized RBF (ENRBF) model was proposed by Xu [RBF nets, mixture experts, and Bayesian Ying-Yang learning, Neurocomputing 19 (1998) 223-257]. In this paper, we present a supplementary study of ENRBF through several carefully designed experiments and further theoretical discussion. We show that ENRBF can effectively improve learning accuracy under certain circumstances. Moreover, since the ENRBF model was originally proposed for regression and function approximation, we take a further step and modify it to handle classification problems. Both the original ENRBF model and the newly proposed ENRBF classifier (ENRBFC) can be viewed as special cases of the mixture-of-experts (ME) model discussed in Xu et al. [An alternative model for mixtures of experts, in: Advances in Neural Information Processing Systems, MIT Press, Cambridge, MA, 1995]. Experimental results demonstrate the potential of ENRBFC compared with several related classifiers.
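To make the relationship between the two models concrete, the following is a minimal NumPy sketch of the forward passes, not the authors' implementation. In an NRBF network the output is a convex combination of constant unit weights w_j under normalized Gaussian activations; the ENRBF extension replaces each constant w_j with a local linear function W_j^T x + c_j. All function names, shapes, and the Gaussian width parameterization here are illustrative assumptions.

```python
import numpy as np

def rbf_activations(X, centers, widths):
    # Gaussian basis values phi_j(x) for each sample: shape (n_samples, n_units).
    # widths holds one (assumed isotropic) width per unit.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def nrbf_output(X, centers, widths, w):
    # NRBF: output is a convex combination of constant weights w_j.
    phi = rbf_activations(X, centers, widths)
    p = phi / phi.sum(axis=1, keepdims=True)   # normalized activations sum to 1
    return p @ w

def enrbf_output(X, centers, widths, W, c):
    # ENRBF: unit j contributes a local linear model W_j^T x + c_j
    # instead of a constant weight, mixed by the same normalized activations.
    phi = rbf_activations(X, centers, widths)
    p = phi / phi.sum(axis=1, keepdims=True)
    local = X @ W.T + c                        # (n_samples, n_units) local outputs
    return (p * local).sum(axis=1)
```

Note that with all linear slopes W_j set to zero, ENRBF reduces exactly to NRBF with w_j = c_j, which is the sense in which ENRBF extends the traditional model; both are gated mixtures in the mixture-of-experts view, with the normalized activations acting as the gate.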