Although multi-layer perceptrons are known for good prediction accuracy, the choice of a proper training sample size tends to be arbitrary, so the performance of a trained multi-layer perceptron fluctuates depending on the sample drawn. As the sample size grows, the gains in prediction accuracy diminish; yet we cannot keep using larger and larger samples, because computing resources and training examples are both limited. This paper suggests a progressive double sampling technique for building better multi-layer perceptrons that copes with the fluctuation of prediction accuracy across samples as well as with the choice of sample size. Experiments with a couple of data sets showed very promising results.
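The abstract does not give the algorithm's details, but the general idea of progressive sampling can be sketched as follows: grow the sample size geometrically, train and evaluate at each size, and stop once the accuracy gain falls below a threshold or the data are exhausted. The schedule parameters (`n0`, `ratio`, `eps`, `runs`) and the `train_eval` callback are hypothetical names introduced for illustration; averaging over several sample draws per size is one plausible way to damp the sample-to-sample fluctuation the abstract describes, not necessarily the paper's method.

```python
def progressive_sampling(train_eval, n_total, n0=100, ratio=2, eps=0.005, runs=3):
    """Grow the training sample geometrically until accuracy converges.

    train_eval(n) is assumed to train a classifier (e.g. an MLP) on a
    random sample of size n and return its held-out accuracy.  Accuracy
    is averaged over `runs` independent draws per size to reduce the
    fluctuation caused by the particular sample drawn.  Stops when the
    mean accuracy gain drops below `eps` or the full data set is used.
    Returns the list of (sample_size, mean_accuracy) pairs visited.
    """
    n, prev, history = n0, None, []
    while True:
        acc = sum(train_eval(n) for _ in range(runs)) / runs
        history.append((n, acc))
        if prev is not None and acc - prev < eps:
            break  # diminishing returns: gain below threshold
        if n >= n_total:
            break  # no more training examples available
        prev = acc
        n = min(n * ratio, n_total)  # double the sample size, capped
    return history
```

For example, with a learning curve that saturates as the sample grows, the schedule stops well before exhausting an unnecessarily large budget; in practice `train_eval` would wrap the actual MLP training and validation loop.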