An effective sampling scheme for better multi-layer perceptrons

  • Authors: Hyontai Sug
  • Affiliations: Division of Computer and Information Engineering, Dongseo University, Busan, Republic of Korea
  • Venue: AIKED'10: Proceedings of the 9th WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases
  • Year: 2010

Abstract

Although multi-layer perceptrons are known to achieve good prediction accuracy, the choice of a proper sample size for training them tends to be arbitrary, so the performance of trained multi-layer perceptrons fluctuates depending on the sample drawn. As the sample size grows, prediction accuracy improves, but only slowly, and ever larger samples cannot be used because computing resources and training examples are both limited. This paper suggests a progressive double sampling technique for building better multi-layer perceptrons that copes with the fluctuation of prediction accuracy caused by both the particular sample drawn and its size. Experiments with a couple of data sets showed very promising results.
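
The abstract does not spell out the procedure, so the following is only a minimal sketch of the general idea of progressive sampling for multi-layer perceptrons: train on a sample, grow the sample size by a fixed schedule, and stop when the accuracy gain levels off. The doubling schedule, the number of repeated samples per size, the stopping threshold, and the scikit-learn MLPClassifier setup are all illustrative assumptions, not the paper's exact "progressive double sampling" method.

    # Sketch of a progressive sampling loop for an MLP (assumptions noted above).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a real data set (the paper's data sets are not reproduced here).
    X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
    X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    def train_on_sample(n, seed):
        """Train an MLP on a random sample of size n drawn from the training pool."""
        idx = rng.choice(len(X_pool), size=n, replace=False)
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=seed)
        clf.fit(X_pool[idx], y_pool[idx])
        return accuracy_score(y_test, clf.predict(X_test))

    best_acc, sample_size = 0.0, 500
    while sample_size <= len(X_pool):
        # Train on several samples of the same size to average out the
        # sample-to-sample fluctuation the abstract refers to.
        accs = [train_on_sample(sample_size, seed) for seed in range(3)]
        mean_acc = float(np.mean(accs))
        print(f"n={sample_size:6d}  mean accuracy={mean_acc:.4f}")
        if mean_acc - best_acc < 0.002:   # accuracy gain has leveled off (illustrative threshold)
            break
        best_acc = mean_acc
        sample_size *= 2                  # grow the sample by doubling (illustrative schedule)

Repeating the training at each sample size addresses the fluctuation across samples, while the stopping rule addresses the cost of ever larger samples; both correspond to the two problems the abstract raises, though the concrete choices here are placeholders.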