Choosing an appropriate sample size for training multi-layer perceptrons is largely arbitrary, so the performance of a trained network tends to fluctuate depending on which sample is drawn. As the sample size grows, the prediction accuracy of multi-layer perceptrons improves slowly, again with some fluctuation. To exploit this property, this paper proposes a progressive and repeated sampling technique that produces better multi-layer perceptrons by coping with the fluctuation in prediction accuracy that depends on both the particular sample and its size. Experiments with six data sets from the UCI machine learning repository showed very good results.
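The progressive-and-repeated sampling idea described above can be sketched as follows. This is an illustrative assumption of how such a procedure might look, not the paper's exact algorithm: at each candidate sample size, several independent samples are drawn and a model is trained on each (repeated sampling, to absorb the sample-dependent fluctuation); the sample size then grows geometrically (progressive sampling) until the best validation accuracy stops improving. A trivial one-dimensional threshold classifier stands in for the multi-layer perceptron so the sketch stays self-contained; all function names and the schedule parameters are hypothetical.

```python
import random

def progressive_repeated_sampling(pool, train, validate,
                                  n0=8, growth=2, repeats=3, tol=0.01):
    """Grow the training sample geometrically; at each size, draw
    several independent samples, train on each, and keep the model
    with the best validation accuracy. Stop once a larger sample no
    longer improves the per-round best accuracy by more than `tol`."""
    best_model, best_acc = None, -1.0
    prev_round_best = -1.0
    n = n0
    while n <= len(pool):
        round_best = -1.0
        for _ in range(repeats):            # repeated sampling at size n
            sample = random.sample(pool, n)
            model = train(sample)
            acc = validate(model)
            round_best = max(round_best, acc)
            if acc > best_acc:
                best_model, best_acc = model, acc
        if prev_round_best >= 0 and round_best - prev_round_best < tol:
            break                           # accuracy has plateaued
        prev_round_best = round_best
        n *= growth                         # progressive (geometric) growth
    return best_model, best_acc

# Toy stand-in for an MLP: a 1-D threshold classifier on (x, label) pairs.
def train(sample):
    pos = [x for x, y in sample if y == 1]
    neg = [x for x, y in sample if y == 0]
    if not pos or not neg:                  # degenerate sample: fixed guess
        return 0.5
    # threshold midway between the two class means
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

random.seed(0)
data = [(random.random(), 0) for _ in range(300)] + \
       [(0.4 + random.random(), 1) for _ in range(300)]
random.shuffle(data)
pool, val = data[:400], data[400:]

def validate(threshold):
    return sum((x > threshold) == (y == 1) for x, y in val) / len(val)

model, acc = progressive_repeated_sampling(pool, train, validate)
```

In a real experiment the `train` function would fit a multi-layer perceptron on the drawn sample, and `validate` would score it on a held-out set; the surrounding loop is unchanged. Keeping the best model across all sizes, rather than only the last, is what lets repeated sampling counteract sample-dependent fluctuation.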