Constructive neural network algorithms are severely prone to overfitting on noisy datasets because, in general, they keep adding resources until the training set is learnt with zero error. In this work we introduce a method for detecting and filtering noisy examples using a recently proposed constructive neural network algorithm. The method exploits the fact that noisy examples are harder to learn, requiring a larger number of synaptic weight modifications than clean examples. Experiments on both controlled settings and real benchmark datasets demonstrate the effectiveness of the approach.
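The abstract does not name the underlying constructive algorithm, so the following is only a minimal sketch of the counting idea on a plain perceptron: track how many weight updates each example triggers during training and flag the examples whose count is far above average as noise candidates. The synthetic data, the function names, and the two-standard-deviation threshold are illustrative assumptions, not the authors' method.

# Minimal sketch (assumed setup, not the paper's algorithm): flag noisy examples
# by counting how often each one forces a weight update during perceptron training.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, nearly separable data with a few flipped ("noisy") labels injected.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
noisy_idx = rng.choice(len(y), size=10, replace=False)
y[noisy_idx] *= -1  # label noise

def count_updates(X, y, epochs=50, lr=0.1):
    """Train a perceptron and record how many updates each example causes."""
    w = np.zeros(X.shape[1])
    b = 0.0
    updates = np.zeros(len(y), dtype=int)
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            if y[i] * (X[i] @ w + b) <= 0:   # misclassified -> update weights
                w += lr * y[i] * X[i]
                b += lr * y[i]
                updates[i] += 1
    return updates

updates = count_updates(X, y)

# Assumed filtering rule: examples needing far more updates than typical
# are treated as noise candidates and removed before retraining.
threshold = updates.mean() + 2 * updates.std()
suspected = np.where(updates > threshold)[0]
print("injected noise:", sorted(noisy_idx))
print("suspected noise:", sorted(suspected))

In this toy setting the flipped examples keep being misclassified and accumulate updates across epochs, so they dominate the high end of the count distribution, which is the intuition the abstract describes.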