The simultaneous topology optimization and training of neural networks has been widely studied in recent years, especially for feedforward models. For recurrent neural networks, existing proposals optimize only the number of hidden units, since full topology optimization is more difficult due to the feedback connections in the network structure. In this work, we study the effects of, and the difficulties in, jointly optimizing the network connections, the hidden neurons, and the network training for dynamical recurrent models. In the experimental section, the proposal is tested on time series prediction problems.
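To illustrate the kind of joint search the abstract describes, the following is a minimal hypothetical sketch (not the paper's actual algorithm): a binary connection mask encodes the recurrent topology while the real-valued weights encode the training state, and both are mutated together by a simple (1+1) evolution strategy on a one-step-ahead sine-prediction task. All names, sizes, and mutation rates here are illustrative assumptions.

```python
# Hypothetical sketch: jointly evolving the connection mask (topology) and
# the weights of a small recurrent network for one-step-ahead prediction.
# Not the paper's algorithm; a (1+1) evolution strategy stands in for it.
import math
import random

random.seed(0)
H = 4  # number of hidden units (illustrative choice)
series = [math.sin(0.3 * t) for t in range(60)]  # toy time series

def predict(mask, W_in, W_rec, W_out, xs):
    """Run the masked recurrent net over inputs xs, returning predictions."""
    h = [0.0] * H
    preds = []
    for x in xs:
        # Feedback connections are active only where mask[i][j] == 1.
        h = [math.tanh(W_in[i] * x +
                       sum(mask[i][j] * W_rec[i][j] * h[j] for j in range(H)))
             for i in range(H)]
        preds.append(sum(W_out[i] * h[i] for i in range(H)))
    return preds

def fitness(ind):
    """Mean squared one-step-ahead prediction error (lower is better)."""
    mask, W_in, W_rec, W_out = ind
    preds = predict(mask, W_in, W_rec, W_out, series[:-1])
    return sum((p - y) ** 2 for p, y in zip(preds, series[1:])) / len(preds)

def random_ind():
    mask = [[random.randint(0, 1) for _ in range(H)] for _ in range(H)]
    W_in = [random.uniform(-1, 1) for _ in range(H)]
    W_rec = [[random.uniform(-1, 1) for _ in range(H)] for _ in range(H)]
    W_out = [random.uniform(-1, 1) for _ in range(H)]
    return mask, W_in, W_rec, W_out

def mutate(ind):
    """Flip connections with small probability; jitter all weights."""
    mask, W_in, W_rec, W_out = ind
    mask = [[m ^ 1 if random.random() < 0.05 else m for m in row]
            for row in mask]
    jitter = lambda w: w + random.gauss(0, 0.1)
    return (mask,
            [jitter(w) for w in W_in],
            [[jitter(w) for w in row] for row in W_rec],
            [jitter(w) for w in W_out])

best = random_ind()
best_f = fitness(best)
for _ in range(300):  # accept a child whenever it is no worse
    child = mutate(best)
    f = fitness(child)
    if f <= best_f:
        best, best_f = child, f

print(round(best_f, 4))  # final prediction MSE of the evolved network
```

Because both the mask and the weights are mutated in each step, topology search and training proceed simultaneously, which is exactly the coupling that makes the recurrent case harder than the feedforward one: flipping a feedback connection changes the dynamics that the co-evolved weights were adapted to.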