A macro-structure cerebellar model articulation controller (MS CMAC) was developed by connecting several 1D CMACs in a tree structure, decomposing a multidimensional problem into a set of 1D subproblems and thereby reducing the computational complexity of a multidimensional CMAC. A trapezium scheme was also proposed to help MS CMAC model nonlinear systems; however, this scheme cannot perform truly smooth interpolation, and its working parameters must be obtained through cross-validation. A quadratic-spline scheme is developed herein to replace the trapezium scheme in MS CMAC, yielding the high-order MS CMAC (HMS CMAC). The quadratic-spline scheme systematically transforms the stepwise weight contents of each CMAC in MS CMAC into smooth weight contents, producing smooth outputs. Test results confirm that HMS CMAC achieves acceptable generalization on continuous function-mapping problems when the training instances have nonoverlapping associations. Nonoverlapping association among training instances not only significantly reduces the number of training instances needed, but also requires only one learning cycle in the learning stage.
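The core idea of replacing stepwise weight contents with a smooth interpolant can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' exact quadratic-spline construction: it contrasts a plain piecewise-constant 1D CMAC lookup with a local quadratic (three-point Lagrange) interpolation through neighboring cell centers, so the output passes smoothly through each stored weight instead of jumping at cell boundaries.

```python
# Toy sketch (not the paper's exact scheme): a 1D CMAC stores one weight
# per quantization cell, giving a stepwise output; interpolating a local
# quadratic through the three nearest cell centers smooths that output.

def cell_index(x, x0, width, n):
    """Index of the quantization cell containing x, clamped to the table."""
    i = int((x - x0) / width)
    return max(0, min(n - 1, i))

def stepwise_output(x, weights, x0, width):
    """Plain CMAC-style lookup: piecewise-constant (stepwise) output."""
    return weights[cell_index(x, x0, width, len(weights))]

def smooth_output(x, weights, x0, width):
    """Local quadratic (Lagrange) interpolation through the three cell
    centers nearest x; the curve passes through each stored weight."""
    n = len(weights)
    i = cell_index(x, x0, width, n)
    i = max(1, min(n - 2, i))  # keep a full three-point stencil in range
    xs = [x0 + (j + 0.5) * width for j in (i - 1, i, i + 1)]  # cell centers
    ys = [weights[j] for j in (i - 1, i, i + 1)]
    out = 0.0
    for a in range(3):
        term = ys[a]
        for b in range(3):
            if b != a:
                term *= (x - xs[b]) / (xs[a] - xs[b])
        out += term
    return out
```

With weights sampled from a quadratic function (e.g. `[0.0, 1.0, 4.0, 9.0, 16.0]` at unit-width cells), the stepwise lookup jumps between cells while the interpolated output reproduces the underlying curve exactly; the paper's quadratic-spline scheme additionally enforces continuity of the pieces across the whole domain.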