In this study, an approach utilizing support vector regression (SVR) as the learning scheme of a cerebellar model articulation controller (CMAC) to handle noisy data is proposed; it is referred to as SVR-CMAC. First, the memory-association vector is transformed via the SVR model, and the output is computed from the SVR model for a given CMAC input. The memory size of the proposed SVR-CMAC therefore depends on the number of support vectors, unlike the conventional CMAC and the kernel CMAC, whose memory sizes depend mainly on the number of input variables. Second, to measure the distance between two memory-association vectors (i.e., unipolar binary input data), a modified Hamming distance is used in the proposed SVR-CMAC; this modified Hamming distance is incorporated into the Gaussian kernel function of the SVR model. Existing SVR software can easily be modified to implement the SVR approach with these new Gaussian kernel functions. In addition, simple procedures for determining the hyperparameters of the proposed SVR-CMAC are presented. Consequently, the proposed SVR-CMAC obtains its final result by solving a single linearly constrained quadratic programming problem, whereas the conventional CMAC and the kernel CMAC must update their weights iteratively. Finally, simulation results show that the proposed SVR-CMAC outperforms the conventional CMAC and the kernel CMAC on noisy data.
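The core idea of incorporating a Hamming-distance measure into a Gaussian kernel for SVR over binary association vectors can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the function name `hamming_gaussian_kernel`, the parameter values, and the use of scikit-learn's `SVR` with a precomputed kernel are all assumptions made for demonstration.

```python
import numpy as np
from sklearn.svm import SVR

def hamming_gaussian_kernel(A, B, gamma=0.5):
    """Gaussian kernel on the Hamming distance between unipolar binary vectors.

    A: (n, m) and B: (k, m) arrays of 0/1 association vectors.
    The Hamming distance is the number of differing bits; the kernel
    is exp(-gamma * distance). Illustrative only, not the paper's exact form.
    """
    d = np.abs(A[:, None, :] - B[None, :, :]).sum(axis=2)
    return np.exp(-gamma * d)

# Toy data: binary memory-association vectors with a noisy target.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(40, 16)).astype(float)
y = np.sin(X.sum(axis=1) / 4.0) + 0.1 * rng.standard_normal(40)

# Train epsilon-SVR on the precomputed Hamming-Gaussian Gram matrix.
K = hamming_gaussian_kernel(X, X)
model = SVR(kernel="precomputed", C=10.0, epsilon=0.05).fit(K, y)

# The effective memory size corresponds to the number of support vectors,
# not the input dimensionality.
print("support vectors:", len(model.support_))

# Prediction requires the kernel between test and training vectors.
K_test = hamming_gaussian_kernel(X[:5], X)
pred = model.predict(K_test)
print("predictions shape:", pred.shape)
```

Because the kernel is supplied as a precomputed Gram matrix, the standard SVR solver handles the quadratic programming step unchanged, which mirrors the paper's point that existing SVR software needs only a kernel-function modification.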