Deterministic global optimal FNN training algorithms
Neural Networks
Combining backpropagation with global search algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO) has been used to improve the efficacy of neural network training. However, these global algorithms suffer from the curse of dimensionality. We propose a new approach that focuses on the topology of the solution space: our method prunes the search space using the Lipschitzian property of the criterion function. We develop procedures that efficiently compute local Lipschitz constants over subsets of the weight space. These local Lipschitz constants can then be used to compute lower bounds on the optimal solution.
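The core idea, Lipschitz-based lower bounding with branch-and-bound pruning, can be illustrated with a minimal one-dimensional sketch. This is not the paper's algorithm (which computes local Lipschitz constants over subsets of the weight space); here a single Lipschitz constant `L` is assumed to be given, and the function name and parameters are illustrative. If `f` has Lipschitz constant `L` on a box, then `f` can be no smaller than `f(midpoint) - L * half-width` anywhere in that box; boxes whose lower bound cannot beat the incumbent are pruned.

```python
import heapq

def lipschitz_minimize(f, a, b, L, tol=1e-4, max_iter=100_000):
    """Minimize f on [a, b] by branch-and-bound, pruning intervals whose
    Lipschitz lower bound cannot improve on the best value found so far."""
    def lower_bound(lo, hi):
        # f cannot drop below f(midpoint) - L * half-width anywhere in [lo, hi]
        return f(0.5 * (lo + hi)) - 0.5 * L * (hi - lo)

    best_x, best_f = a, f(a)
    if f(b) < best_f:
        best_x, best_f = b, f(b)
    heap = [(lower_bound(a, b), a, b)]  # min-heap ordered by lower bound
    for _ in range(max_iter):
        if not heap or best_f - heap[0][0] <= tol:
            break  # incumbent is within tol of the global minimum
        lb, lo, hi = heapq.heappop(heap)
        if lb >= best_f:
            continue  # prune: this interval cannot contain a better point
        mid = 0.5 * (lo + hi)
        fm = f(mid)
        if fm < best_f:
            best_x, best_f = mid, fm
        for sub_lo, sub_hi in ((lo, mid), (mid, hi)):
            slb = lower_bound(sub_lo, sub_hi)
            if slb < best_f:
                heapq.heappush(heap, (slb, sub_lo, sub_hi))
    return best_x, best_f
```

For example, minimizing the toy criterion `f(x) = (x - 1)^2` over `[-2, 3]` (where `L = 6` bounds `|f'|`) converges to the minimizer `x = 1` while discarding most of the interval without dense sampling; the same pruning principle extends to boxes in a multidimensional weight space.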