In this paper, we propose a new neural network architecture based on a family of referential multilayer perceptrons (RMLPs) that play the role of generalized receptive fields. In contrast to "standard" radial basis function (RBF) neural networks, the proposed topology offers a considerable level of flexibility: the resulting receptive fields are highly diversified and capable of adjusting themselves to the characteristics of the locally available experimental data. We discuss in detail a design strategy for the novel architecture that fully exploits the modeling capabilities of the contributing RMLPs. The strategy comprises three phases. In the first phase, we form a "blueprint" of the network by employing a specialized version of the commonly encountered fuzzy C-means (FCM) clustering algorithm, namely the conditional (context-based) FCM. The intent in this phase is to generate a collection of information granules (fuzzy sets) in the space of input and output variables, conditioned on certain predefined contexts. In the second phase, taking a global view of the structure, we refine the input-output relationships by engaging a collection of RMLPs, where each RMLP is trained on the subset of data associated with the corresponding context fuzzy set. During training, each receptive field focuses on the characteristics of its locally available data and builds a nonlinear mapping in a referential mode. Finally, the connections of the receptive fields are optimized through global minimization of the error at the linear aggregation unit located in the output layer of the overall architecture. We also include a series of numeric experiments involving synthetic and real-world data sets, which provide a thorough comparative analysis against standard RBF neural networks.
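The three-phase strategy described above can be sketched in code. This is a minimal illustration under stated assumptions, not the paper's implementation: triangular fuzzy sets over the output variable serve as contexts, conditional FCM (where each sample's memberships sum to its context value rather than to 1) clusters the input space per context, simple weighted linear models stand in for the RMLP receptive fields, and the output aggregation layer is fit by global least squares. All function names, context parameters, and the synthetic data set are assumptions made for the demo.

```python
import numpy as np

def conditional_fcm(X, f, c, m=2.0, iters=100, seed=0):
    """Conditional (context-based) FCM: memberships of sample k sum to f[k]."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), size=c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        # Standard FCM update, scaled so that row sums equal the context value f[k]
        U = f[:, None] / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(2)
        W = U ** m
        V = (W.T @ X) / (W.sum(0)[:, None] + 1e-12)
    return U, V

def tri(t, a, b, c):
    """Triangular membership function defining one fuzzy context over the output."""
    left = (t - a) / (b - a) if b > a else np.ones_like(t)
    right = (c - t) / (c - b) if c > b else np.ones_like(t)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

# Synthetic 1-D regression task (assumed for the demo)
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(300)

# Phase 1: fuzzy contexts over y, then conditional FCM in the input space.
contexts = [(-1.3, -1.3, 0.0), (-1.3, 0.0, 1.3), (0.0, 1.3, 1.3)]
fields = []  # (center, local linear model) per receptive field
for a, b, cc in contexts:
    f = tri(y, a, b, cc)
    keep = f > 1e-3
    U, V = conditional_fcm(X[keep], f[keep], c=2)
    # Phase 2: one local model per receptive field, trained on its context data.
    # Membership-weighted linear regression stands in for the paper's RMLPs here.
    Xa = np.hstack([X[keep], np.ones((keep.sum(), 1))])
    for i, v in enumerate(V):
        A = Xa * U[:, i][:, None]
        beta = np.linalg.lstsq(A.T @ Xa, A.T @ y[keep], rcond=None)[0]
        fields.append((v, beta))

def receptive_outputs(X, fields, s=1.0):
    """Gaussian receptive-field activation times the local model's output."""
    Xa = np.hstack([X, np.ones((len(X), 1))])
    cols = [np.exp(-np.sum((X - v) ** 2, axis=1) / s) * (Xa @ beta)
            for v, beta in fields]
    return np.column_stack(cols)

# Phase 3: global least-squares fit of the linear output aggregation layer.
Z = receptive_outputs(X, fields)
agg = np.linalg.lstsq(Z, y, rcond=None)[0]
mse = np.mean((Z @ agg - y) ** 2)
```

The conditional constraint is what differentiates phase one from plain FCM: because the update merely rescales the standard membership formula, each row of `U` sums exactly to its context value, so clusters form only where the context fuzzy set is active.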