Ensemble fuzzy rule-based classifier design by parallel distributed fuzzy GBML algorithms
SEAL'12 Proceedings of the 9th international conference on Simulated Evolution and Learning
We have previously proposed the simultaneous use of population subdivision and training data subdivision, which significantly reduces the computation time of genetics-based machine learning (GBML) on large data sets. In our approach, the population is subdivided into multiple sub-populations, as in island models, and the subdivided training data sets are rotated over the sub-populations. In this paper, we focus on the effect of training data rotation on the generalization ability and the computation time of our hybrid fuzzy GBML algorithm. First we describe a parallel distributed implementation of the algorithm. Then we examine the effect of training data rotation through computational experiments in which both a single-population (i.e., non-parallel) and a multi-population (i.e., parallel) version of the algorithm are applied to a multi-class, high-dimensional problem with a large number of training patterns. Experimental results show that training data rotation improves the generalization ability of our GBML algorithm. They also show that the computation time depends more directly on the population size than on the training data set size.
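The subdivision-and-rotation scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sub-populations are placeholders, the evolution step is omitted, and the function name and rotation rule (`(i + gen) mod num_islands`) are assumptions chosen for clarity.

```python
import random

def parallel_gbml_sketch(data, num_islands=4, generations=8, seed=0):
    """Sketch of population subdivision with training-data rotation.

    Each island (sub-population) trains on one data subset per
    generation; the subsets are rotated over the islands so that
    every island eventually sees the whole training data set.
    """
    rng = random.Random(seed)
    rng.shuffle(data)
    # Subdivide the training data into one subset per island.
    subsets = [data[i::num_islands] for i in range(num_islands)]
    # Placeholder sub-populations; a real GBML run would hold rule sets here.
    islands = [f"subpop-{i}" for i in range(num_islands)]

    history = []  # (generation, island index, subset index) triples
    for gen in range(generations):
        for i, _island in enumerate(islands):
            # Rotation: island i uses subset (i + gen) mod num_islands.
            subset_idx = (i + gen) % num_islands
            history.append((gen, i, subset_idx))
            # ... evolve the island's sub-population against
            #     subsets[subset_idx] here (omitted) ...
    return history
```

With four islands and at least four generations, each island is assigned every subset at least once, which is the mechanism the paper credits for the improved generalization ability.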