Training Data Subdivision and Periodical Rotation in Hybrid Fuzzy Genetics-Based Machine Learning

  • Authors:
  • Hisao Ishibuchi; Shingo Mihara; Yusuke Nojima

  • Venue:
  • ICMLA '11 Proceedings of the 2011 10th International Conference on Machine Learning and Applications and Workshops - Volume 01
  • Year:
  • 2011

Abstract

We have previously proposed the simultaneous use of population subdivision and training data set subdivision, which leads to a significant decrease in the computation time of genetics-based machine learning (GBML) on large data sets. In our approach, a population is subdivided into multiple sub-populations, as in island models, and the subdivided training data are rotated over the sub-populations. In this paper, we focus on the effect of training data rotation on the generalization ability and computation time of our hybrid fuzzy GBML algorithm. First, we describe the parallel distributed implementation of our hybrid fuzzy GBML algorithm. Then we examine the effect of training data rotation through computational experiments in which both single-population (i.e., non-parallel) and multi-population (i.e., parallel) versions of our GBML algorithm are applied to a multi-class, high-dimensional problem with a large number of training patterns. Experimental results show that training data rotation improves the generalization ability of our GBML algorithm. They also show that the computation time is more directly related to the population size than to the training data set size.
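The subdivision-and-rotation scheme described above can be sketched as follows. This is a minimal illustrative skeleton, not the authors' actual hybrid fuzzy GBML algorithm: the `subdivide`, `run_parallel_gbml`, `n_islands`, and `rotation_interval` names are assumptions, and the per-generation evolution step is a placeholder. It shows only the bookkeeping: the population and training data are each split into the same number of parts, and the data subsets are shifted over the sub-populations in round-robin fashion at fixed intervals.

```python
import random

def subdivide(items, n_parts):
    """Split a list into n_parts nearly equal contiguous chunks."""
    k, r = divmod(len(items), n_parts)
    parts, start = [], 0
    for i in range(n_parts):
        end = start + k + (1 if i < r else 0)
        parts.append(items[start:end])
        start = end
    return parts

def run_parallel_gbml(population, training_data, n_islands=4,
                      generations=20, rotation_interval=5):
    """Toy island-model loop with periodical training data rotation.

    Returns the per-generation assignment of data subsets to islands,
    so the rotation schedule can be inspected.
    """
    islands = subdivide(population, n_islands)
    data_subsets = subdivide(training_data, n_islands)
    # assignment[i] = index of the data subset currently used by island i
    assignment = list(range(n_islands))
    history = []
    for gen in range(generations):
        if gen > 0 and gen % rotation_interval == 0:
            # rotate the training data subsets over the sub-populations
            assignment = [(a + 1) % n_islands for a in assignment]
        history.append(tuple(assignment))
        for i, island in enumerate(islands):
            subset = data_subsets[assignment[i]]
            # placeholder for one generation of rule evolution on `subset`;
            # a real GBML step would evaluate and recombine rules here
            random.shuffle(island)
    return history
```

Because each sub-population evaluates its rules on only one data subset per generation, fitness evaluation cost scales with the subset size rather than the full training set, while rotation ensures every sub-population eventually sees all of the data.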