The garden in the machine: the emerging science of artificial life
Energy dependent adaptation of mutation rates in computer models of evolution
ALIFE Proceedings of the Sixth International Conference on Artificial Life
Data mining: practical machine learning tools and techniques with Java implementations
MultiBoosting: A Technique for Combining Boosting and Wagging
Machine Learning
Evolving neural networks through augmenting topologies
Evolutionary Computation
Classification by Voting Feature Intervals
ECML '97 Proceedings of the 9th European Conference on Machine Learning
Evolving Neural Networks For The Classification Of Galaxies
GECCO '02 Proceedings of the Genetic and Evolutionary Computation Conference
GANNet: A Genetic Algorithm for Optimizing Topology and Weights in Neural Network Design
IWANN '93 Proceedings of the International Workshop on Artificial Neural Networks: New Trends in Neural Computation
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Evolving Complex Neural Networks
AI*IA '07 Proceedings of the 10th Congress of the Italian Association for Artificial Intelligence on AI*IA 2007: Artificial Intelligence and Human-Oriented Computing
Evolving multilayer feedforward neural network using adaptive particle swarm algorithm
International Journal of Hybrid Intelligent Systems
In this paper we present preliminary work on evolving mutation parameters, in order to understand whether mutation parameter tuning can be skipped altogether. In particular, rather than treating mutation parameters as global environmental features, we regard them as endogenous features of the individuals by encoding them directly in the genotype. In this way, the optimal values emerge from the evolutionary process itself. As a case study, we apply the proposed methodology to the training of feed-forward neural networks on nine classification benchmarks and compare it to five other well-established techniques. The results show that the proposed approach achieves very promising performance while avoiding the tedious task of off-line optimal parameter tuning.
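The core idea of encoding mutation parameters in the genotype can be sketched as follows. This is a minimal illustration of self-adaptive mutation, assuming a common log-normal update rule for the step size; the function names, the `tau` learning rate, and the update scheme are illustrative assumptions, not the paper's exact method.

```python
import math
import random

def make_individual(n_weights):
    # The genotype holds both the network weights and its own mutation
    # step size (sigma), making the mutation parameter endogenous.
    return {
        "weights": [random.gauss(0.0, 1.0) for _ in range(n_weights)],
        "sigma": 0.1,  # per-individual mutation parameter, evolved along with the weights
    }

def mutate(ind, tau=0.2):
    # First perturb the mutation parameter itself (log-normal update),
    # then use the new sigma to perturb the network weights.
    child_sigma = ind["sigma"] * math.exp(tau * random.gauss(0.0, 1.0))
    child_weights = [w + random.gauss(0.0, child_sigma) for w in ind["weights"]]
    return {"weights": child_weights, "sigma": child_sigma}
```

Because selection acts on the whole genotype, individuals whose sigma suits the current fitness landscape tend to produce fitter offspring, so useful mutation rates emerge without off-line tuning.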