Neural networks are widely used for machine learning and data mining tasks, and because data mining problems involve large volumes of data, sampling is essential to the success of such tasks. Radial basis function networks are a representative neural network algorithm known for good prediction accuracy in many applications, but, as with other data mining algorithms, there is no established way to determine a proper sample size, so the choice tends to be arbitrary. As the sample size grows, the error rate improves, but with diminishing returns; moreover, we cannot simply keep enlarging the sample, because accuracy fluctuates depending on which samples are drawn. This paper suggests a progressive resampling technique to cope with this situation, and experiments validate the suggestion with very promising results.
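To make the idea concrete, here is a minimal sketch of progressive resampling around a toy RBF network. The geometric size schedule, the plateau-based stopping tolerance `tol`, and the `SimpleRBFNetwork` helper (k-means centers with least-squares output weights) are illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch: grow the training sample geometrically, redrawing it
# at each size, and stop once the validation error stops improving.
# All names, parameters, and the stopping rule are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score


class SimpleRBFNetwork:
    """Bare-bones RBF network: k-means centers, Gaussian hidden layer,
    least-squares output weights (hypothetical helper, for illustration)."""

    def __init__(self, n_centers=20, width=1.0):
        self.n_centers = n_centers
        self.width = width

    def _hidden(self, X):
        # Gaussian activation of each input against each center.
        d = np.linalg.norm(X[:, None, :] - self.centers_[None, :, :], axis=2)
        return np.exp(-(d ** 2) / (2.0 * self.width ** 2))

    def fit(self, X, y):
        self.classes_, y_idx = np.unique(y, return_inverse=True)
        km = KMeans(n_clusters=self.n_centers, n_init=10).fit(X)
        self.centers_ = km.cluster_centers_
        H = self._hidden(X)
        T = np.eye(len(self.classes_))[y_idx]            # one-hot targets
        self.W_, *_ = np.linalg.lstsq(H, T, rcond=None)  # output weights
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._hidden(X) @ self.W_, axis=1)]


def progressive_resample(X, y, X_val, y_val, n0=100, growth=2.0,
                         tol=0.005, seed=0):
    """Grow the training sample geometrically and stop when the
    validation error improves by less than `tol` (assumed criterion)."""
    rng = np.random.default_rng(seed)
    n, best_err = n0, 1.0
    while True:
        n = min(n, len(X))
        idx = rng.choice(len(X), size=n, replace=False)  # fresh sample each step
        model = SimpleRBFNetwork().fit(X[idx], y[idx])
        err = 1.0 - accuracy_score(y_val, model.predict(X_val))
        if best_err - err < tol or n == len(X):          # learning curve plateaued
            return model, n
        best_err, n = err, int(n * growth)
```

Redrawing the sample at each size, rather than only extending it, is one plausible way to address the sample-to-sample fluctuation in accuracy that the abstract mentions, since the stopping decision is then not tied to a single unlucky draw.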