The generalization capability of a multilayer perceptron can be adjusted by adding a penalty (weight decay) term to the cost function used during training. In this paper we present a heuristic method for finding a good coefficient for this regularization term while simultaneously searching for a well-regularized MLP model. The heuristic is based on validation error, but not strictly in the sense of early stopping: instead, we compare different coefficients using a subdivision of the training data for quality evaluation, aiming to find a coefficient that yields good generalization even after a training run that converges fully to a cost minimum under a given accuracy goal. At the time of writing we are still benchmarking and improving the heuristic, which is published here for the first time.
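For illustration, the penalized cost referred to in the abstract is conventionally of the form J(w) = E(w) + (lambda/2) * ||w||^2, where E(w) is the data-fitting error and lambda is the weight decay coefficient being tuned. The sketch below shows only the general validation-based selection scheme the abstract describes, not the paper's own heuristic: train an MLP toward full convergence for each candidate lambda and keep the one with the lowest error on a held-out subdivision of the training data. The dataset, candidate grid, and network size are illustrative assumptions; scikit-learn's alpha parameter plays the role of lambda here.

```python
# Minimal sketch: pick a weight-decay coefficient by comparing validation
# errors of models trained to (near) full convergence, rather than by
# early stopping. All concrete values are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic regression data standing in for a real training set.
X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)

# Subdivide the training data: one part for fitting, one for quality evaluation.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.25, random_state=0
)

best_coef, best_err = None, np.inf
for coef in [1e-5, 1e-4, 1e-3, 1e-2, 1e-1]:  # candidate weight-decay coefficients
    mlp = MLPRegressor(
        hidden_layer_sizes=(20,),
        alpha=coef,      # L2 (weight decay) penalty strength in scikit-learn
        max_iter=5000,   # allow the run to approach full convergence,
        tol=1e-6,        # instead of stopping early
        random_state=0,
    )
    mlp.fit(X_train, y_train)
    err = np.mean((mlp.predict(X_val) - y_val) ** 2)  # validation MSE
    if err < best_err:
        best_coef, best_err = coef, err

print(f"selected weight-decay coefficient: {best_coef:g} (val MSE {best_err:.2f})")
```

A grid search like this trades training time for robustness: each candidate coefficient gets a full training run, and the validation subdivision, not the training loss, decides which regularized model generalizes best.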