Neuro-fuzzy and soft computing: a computational approach to learning and machine intelligence
Data Mining Using Dynamically Constructed Recurrent Fuzzy Neural Networks
PAKDD '98 Proceedings of the Second Pacific-Asia Conference on Research and Development in Knowledge Discovery and Data Mining
Data Mining with Computational Intelligence (Advanced Information and Knowledge Processing)
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Objective functions for training new hidden units in constructive neural networks
IEEE Transactions on Neural Networks
Kernel orthonormalization in radial basis function neural networks
IEEE Transactions on Neural Networks
Lowe [1] proposed that the kernel parameters of a radial basis function (RBF) neural network may first be fixed, with the output-layer weights then determined by the pseudo-inverse. Jang, Sun, and Mizutani ([2], p. 342) pointed out that this type of two-step training method can also be applied to fuzzy neural networks (FNNs). Through extensive computer simulations, we [3] demonstrated that an FNN with randomly fixed membership-function parameters (FNN-RM) trains faster and generalizes better than the classical FNN. To provide a theoretical basis for the FNN-RM, this paper presents an intuitive proof of its universal approximation ability, based on the orthogonal set theory proposed by Kaminski and Strumillo for RBF neural networks [4].
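The two-step training scheme described above can be sketched as follows for a plain RBF network: the kernel centers and widths are fixed at random, and the output weights are then solved in closed form with the pseudo-inverse. This is a minimal NumPy illustration on a toy regression task; the center range, width, and hidden-layer size are arbitrary choices for the example, not values taken from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression target: y = sin(x) on [0, 2*pi]
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Step 1: fix the kernel parameters at random (hypothetical settings).
n_hidden = 30
centers = rng.uniform(0.0, 2.0 * np.pi, size=(n_hidden, 1))
width = 0.5

# Hidden-layer activations: Gaussian RBF kernels.
dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
H = np.exp(-(dists ** 2) / (2.0 * width ** 2))

# Step 2: determine the output weights by the pseudo-inverse,
# i.e. the least-squares solution of H w = y.
w = np.linalg.pinv(H) @ y
y_hat = H @ w

mse = np.mean((y - y_hat) ** 2)
print(mse)
```

With the hidden-layer parameters frozen, the only "training" is a single linear least-squares solve, which is what makes this family of methods fast compared with gradient descent on all parameters.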