The design of radial basis function neural networks (RBFNNs) remains a difficult task when they are applied to classification or regression problems. The difficulty lies in setting the parameters that define an RBFNN: the number of RBFs, the positions of their centers, and the lengths of their radii. Another issue that must be faced when applying these models to real-world problems is selecting the variables that the RBFNN will use as inputs. The literature presents several methodologies that perform these two tasks separately; however, thanks to the intrinsic parallelism of genetic algorithms, a parallel implementation allows the algorithm proposed in this paper to evolve solutions to both problems at the same time. The parallelization not only covers the joint evolution of the two problems but also the specialization of the crossover and mutation operators, so that each of the elements to be optimized when designing RBFNNs evolves under operators suited to it. The underlying genetic algorithm is the non-dominated sorting genetic algorithm II (NSGA-II), which keeps a balance between the size of the network and its approximation accuracy in order to avoid overfitted networks. Another novelty of the proposed algorithm is the incorporation of local search at three stages: initialization of the population, evolution of the individuals, and final optimization of the Pareto front. The individuals are initialized by hybridizing clustering techniques with mutual information (MI) theory to select the input variables. As the experiments show, the synergy of the different paradigms and techniques combined in the presented algorithm makes it possible to obtain very accurate models that use the most significant input variables.
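To make the design space concrete, the following is a minimal sketch of the model being evolved: a Gaussian RBFNN whose free parameters are exactly the groups named above (number of RBFs, their centers, their radii) plus the output-layer weights. The function and parameter names are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of a Gaussian RBFNN, assuming the common form
# f(x) = sum_j w_j * exp(-||x - c_j||^2 / r_j^2). All names here are
# illustrative; they do not reproduce the paper's implementation.
import numpy as np

def rbfnn_predict(X, centers, radii, weights):
    """Evaluate the RBFNN on a batch of samples.

    X       : (n_samples, n_inputs) input matrix
    centers : (n_rbfs, n_inputs) RBF centers
    radii   : (n_rbfs,) RBF radii (lengths)
    weights : (n_rbfs,) output-layer weights
    """
    # Squared Euclidean distance from every sample to every center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    # Gaussian activations, one column per RBF
    phi = np.exp(-d2 / radii ** 2)
    return phi @ weights
```

Under such a parameterization, an individual in the evolutionary population would only need to encode the selected input indices together with these arrays, which is what makes the joint evolution of network structure and input subset natural.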
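Similarly, a hedged sketch of the MI-driven input selection used when initializing individuals: scikit-learn's MI estimator stands in for whichever estimator the authors actually employ, and `n_keep` is an illustrative parameter rather than anything specified in the paper.

```python
# A sketch of mutual-information-based input selection, assuming a
# regression target; the estimator and the n_keep cutoff are stand-in
# assumptions, not the paper's exact procedure.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def select_inputs_by_mi(X, y, n_keep):
    """Rank candidate inputs by estimated MI with the target and
    return the indices of the n_keep most informative ones."""
    mi = mutual_info_regression(X, y)
    keep = np.argsort(mi)[::-1][:n_keep]
    return np.sort(keep)
```

The selected indices would then restrict which columns of the data the clustering-based initialization sees when placing the initial RBF centers.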