In this study, we present a new architecture of a granular neural network, provide a comprehensive design methodology, and elaborate on an algorithmic setup supporting its development. The proposed network belongs to the broad category of radial basis function neural networks (RBFNNs) in the sense that its topology involves a collection of receptive fields. In contrast to standard RBFNN architectures, the individual receptive fields are formed here in subspaces of the original input space rather than in the entire input space, and these subspaces may differ from one receptive field to another. The architecture of the network fully reflects the structure of the training data, which are granulated with the aid of clustering techniques. More specifically, the output space is granulated using K-means clustering, while the information granules in the multidimensional input space are formed by the so-called context-based fuzzy C-means, which takes into account the structure already formed in the output space. The innovative facet of the development is a dynamic reduction of the dimensionality of the input space: the information granules are formed in a subspace of the overall input space, obtained by selecting a subset of input variables such that this subspace retains the structure of the entire space. As this search is combinatorial in character, we use genetic optimization (genetic algorithms (GAs), to be more specific) to determine the optimal input subspaces. A series of numeric studies exploiting synthetic data and data from the Machine Learning Repository, University of California at Irvine, provides detailed insight into the nature of the algorithm and its parameters, and offers a comparative analysis.
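The core algorithmic step described above can be illustrated with a minimal sketch of context-based (conditional) fuzzy C-means: each sample carries a context value f_k expressing how strongly its output belongs to a granule formed in the output space (e.g., by K-means), and the fuzzy memberships in the input space are constrained to sum to f_k rather than to 1. The function name, parameter defaults, and the random-prototype initialization below are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def conditional_fcm(X, f, c, m=2.0, n_iter=50, seed=0):
    """Context-based (conditional) fuzzy C-means — a sketch.

    X : (N, d) array of inputs, restricted to a candidate input subspace.
    f : (N,)  context membership of each sample's output in one output granule.
    c : number of receptive fields (clusters) to form for this context.
    m : fuzzification coefficient (m > 1).
    Returns the membership matrix U (N, c) and the prototypes V (c, d).
    """
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # Illustrative initialization: prototypes drawn from the data points.
    V = X[rng.choice(N, size=c, replace=False)].astype(float)
    for _ in range(n_iter):
        # Distances from every sample to every prototype (small eps avoids 0-division).
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        # Standard FCM weights, then rescaled so each row of U sums to f_k
        # instead of 1 — this is the "conditional" modification.
        W = 1.0 / (D ** (2.0 / (m - 1.0)))
        U = f[:, None] * W / W.sum(axis=1, keepdims=True)
        # Prototype update: fuzzily weighted means of the samples.
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]
    return U, V
```

Within the overall design, this routine would be run once per output-space granule (with f taken from that granule's membership), and a GA would search over subsets of input variables, feeding each candidate subspace's columns of X into the clustering and scoring the resulting network.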