A Direct Link Minimal Resource Allocation Network for Adaptive Noise Cancellation
Neural Processing Letters
Nonlinear System Identification Using Lyapunov Based Fully Tuned Dynamic RBF Networks
Neural Processing Letters
An Adaptive Learning Algorithm Aimed at Improving RBF Network Generalization Ability
AI '02 Proceedings of the 15th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
Pattern Recognition and Neural Networks
Machine Learning and Its Applications, Advanced Lectures
Classification of MCA Stenosis in Diabetes by MLP and RBF Neural Network
Journal of Medical Systems
Construction of Robot Intra-modal and Inter-modal Coordination Skills by Developmental Learning
Journal of Intelligent and Robotic Systems
EURASIP Journal on Applied Signal Processing
Improved GAP-RBF network for classification problems
Neurocomputing
Minimal radial basis function network based bus protection system using OCT
ISTASC'05 Proceedings of the 5th WSEAS/IASME International Conference on Systems Theory and Scientific Computation
A nonlinear transversal fuzzy filter with online clustering
ICECS'03 Proceedings of the 2nd WSEAS International Conference on Electronics, Control and Signal Processing
An online self-improved fuzzy filter and its applications
ICECS'03 Proceedings of the 2nd WSEAS International Conference on Electronics, Control and Signal Processing
Minimal Resource Allocation on CAN Bus Using Radial Basis Function Networks
ISNN '07 Proceedings of the 4th international symposium on Neural Networks: Advances in Neural Networks, Part III
Model selection approaches for non-linear system identification: a review
International Journal of Systems Science
Small Number of Hidden Units for ELM with Two-Stage Linear Model
IEICE - Transactions on Information and Systems
A Versatile Hyper-Ellipsoidal Basis Function for Function Approximation in High Dimensional Space
ISNN '09 Proceedings of the 6th International Symposium on Neural Networks on Advances in Neural Networks
An Online Self-constructing Fuzzy Neural Network with Restrictive Growth
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II
Pipelined Genetic Algorithm Initialized RAN Based RBF Modulation Classifier
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II
A sequential learning algorithm for online constructing belief-rule-based systems
Expert Systems with Applications: An International Journal
Evolving logic networks with real-valued inputs for fast incremental learning
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on human computing
A constructive enhancement for online sequential extreme learning machine
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
A fast and compact fuzzy neural network for online extraction of fuzzy rules
CCDC'09 Proceedings of the 21st annual international conference on Chinese control and decision conference
Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology
Online training for single hidden-layer
CIRA'09 Proceedings of the 8th IEEE international conference on Computational intelligence in robotics and automation
Channel equalization using neural networks: a review
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Incremental learning with multi-level adaptation
Neurocomputing
Classification and retrieval on macroinvertebrate image databases
Computers in Biology and Medicine
A generalized online self-constructing fuzzy neural network
ISNN'11 Proceedings of the 8th international conference on Advances in neural networks - Volume Part II
A Generalized Ellipsoidal Basis Function Based Online Self-constructing Fuzzy Neural Network
Neural Processing Letters
Regularized online sequential learning algorithm for single-hidden layer feedforward neural networks
Pattern Recognition Letters
A novel RBF neural network with fast training and accurate generalization
CIS'04 Proceedings of the First international conference on Computational and Information Science
Adaptive point-cloud surface interpretation
CGI'06 Proceedings of the 24th international conference on Advances in Computer Graphics
Neuro-controller design for nonlinear fighter aircraft maneuver using fully tuned RBF networks
Automatica (Journal of IFAC)
ABR traffic management using minimal resource allocation (neural) networks
Computer Communications
Agent-Based approach to RBF network training with floating centroids
ICCCI'12 Proceedings of the 4th international conference on Computational Collective Intelligence: technologies and applications - Volume Part II
Proceedings of the 15th annual conference on Genetic and evolutionary computation
Fixed budget quantized kernel least-mean-square algorithm
Signal Processing
FRAN and RBF-PSO as two components of a hyper framework to recognize protein folds
Computers in Biology and Medicine
An improved learning scheme for extracting t-s fuzzy rules from data samples
ISNN'13 Proceedings of the 10th international conference on Advances in Neural Networks - Volume Part II
This paper presents a detailed performance analysis of the minimal resource allocation network (M-RAN) learning algorithm. M-RAN is a sequential learning radial basis function neural network that combines the growth criterion of the resource allocating network (RAN) of Platt (1991) with a pruning strategy based on the relative contribution of each hidden unit to the overall network output; the resulting network tends toward a minimal topology for the RAN. The performance of this algorithm is compared with that of multilayer feedforward networks (MFNs) trained with (1) RPROP, a variant of the standard backpropagation algorithm, and (2) the dependence identification (DI) algorithm of Moody and Antsaklis (1996), on several benchmark problems in function approximation and pattern classification. On all of these problems, M-RAN is shown to realize networks with far fewer hidden neurons at equal or better approximation/classification accuracy. Furthermore, the time taken for learning (training) is considerably shorter, since M-RAN does not require repeated presentation of the training data.
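The two mechanisms the abstract describes can be sketched in code: a growth test (add a hidden unit when the prediction error and the distance to the nearest center both exceed thresholds), a parameter update otherwise, and pruning of units whose relative contribution to the output stays small. The following is a minimal, illustrative Python sketch of an M-RAN-style learner, not the paper's exact algorithm: the parameter update here is a simple LMS step rather than the paper's method, and all hyperparameter names (e_min, eps, kappa, delta, window, lr) are illustrative choices, not the original notation.

```python
import numpy as np

class MRAN:
    """Illustrative M-RAN-style sequential RBF learner (1-D output)."""

    def __init__(self, e_min=0.05, eps=0.3, kappa=0.9,
                 delta=0.01, window=20, lr=0.05):
        self.centers, self.widths, self.weights = [], [], []
        self.low_count = []    # per-unit count of consecutive low-contribution steps
        self.e_min = e_min     # growth: minimum prediction error
        self.eps = eps         # growth: minimum distance to nearest center
        self.kappa = kappa     # width overlap factor for new units
        self.delta = delta     # pruning: relative contribution threshold
        self.window = window   # pruning: steps below threshold before removal
        self.lr = lr           # LMS step size

    def _phi(self, x):
        # Gaussian responses of all hidden units at input x
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2.0 * w ** 2))
                         for c, w in zip(self.centers, self.widths)])

    def predict(self, x):
        if not self.centers:
            return 0.0
        return float(np.dot(self.weights, self._phi(np.asarray(x, float))))

    def partial_fit(self, x, y):
        x = np.asarray(x, dtype=float)
        err = y - self.predict(x)
        d = min((np.linalg.norm(x - c) for c in self.centers), default=np.inf)
        if abs(err) > self.e_min and d > self.eps:
            # Growth: large error AND input far from every existing center
            self.centers.append(x.copy())
            self.widths.append(self.kappa * (d if np.isfinite(d) else 1.0))
            self.weights.append(err)
            self.low_count.append(0)
        elif self.centers:
            # Otherwise adapt output weights by a stochastic-gradient (LMS) step
            phi = self._phi(x)
            w = np.asarray(self.weights) + self.lr * err * phi
            self.weights = list(w)
            # Pruning: track each unit's contribution relative to the largest
            contrib = np.abs(w * phi)
            top = contrib.max() if contrib.max() > 0 else 1.0
            for i in range(len(self.low_count)):
                self.low_count[i] = (self.low_count[i] + 1
                                     if contrib[i] / top < self.delta else 0)
            keep = [i for i, c in enumerate(self.low_count) if c < self.window]
            self.centers = [self.centers[i] for i in keep]
            self.widths = [self.widths[i] for i in keep]
            self.weights = [self.weights[i] for i in keep]
            self.low_count = [self.low_count[i] for i in keep]
        return err
```

Because each sample is seen once and units are both added and removed online, the hidden layer stays far smaller than the number of training samples, which is the "minimal topology" property the abstract emphasizes.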