Scaling Genetic Programming to Large Datasets Using Hierarchical Dynamic Subset Selection

  • Authors:
  • R. Curry; P. Lichodzijewski; M. I. Heywood

  • Affiliations:
  • Dalhousie University, Halifax

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
  • Year:
  • 2007


Abstract

The computational overhead of genetic programming (GP) can be addressed directly, without recourse to hardware solutions, by active learning algorithms based on the random or dynamic subset selection heuristics (RSS or DSS). This correspondence begins by presenting a family of hierarchical DSS algorithms: RSS-DSS, cascaded RSS-DSS, and the balanced block DSS algorithm, the last of which has not been previously introduced. Extensive benchmarking on four unbalanced real-world binary classification problems with 30 000 to 500 000 training exemplars demonstrates that both the cascade and balanced block algorithms reduce the likelihood of degenerate individuals while significantly improving classification accuracy relative to the original RSS-DSS algorithm. Moreover, comparison with GP trained without an active learning algorithm indicates that classification performance is not compromised, while training completes in minutes as opposed to half a day.
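The dynamic subset selection heuristic the abstract builds on can be sketched as follows. This is a minimal illustration in the spirit of Gathercole and Ross's DSS (selection weight combining exemplar difficulty and age), not the authors' hierarchical algorithms; the function names, weighting exponents, and update rule are assumptions for illustration.

```python
import random

def dss_select(difficulty, age, subset_size, d_exp=1.0, a_exp=1.0):
    """Draw a training subset with selection weight proportional to
    difficulty**d_exp + age**a_exp (Gathercole-Ross-style DSS weighting).
    A small floor keeps never-selected, never-misclassified exemplars reachable."""
    n = len(difficulty)
    weights = [difficulty[i] ** d_exp + age[i] ** a_exp + 1e-9 for i in range(n)]
    chosen = set()
    while len(chosen) < subset_size:
        chosen.add(random.choices(range(n), weights=weights)[0])
    return sorted(chosen)

def update_stats(difficulty, age, chosen, errors):
    """After evaluating the GP population on the subset: every exemplar ages
    by one generation, selected exemplars reset their age, and misclassified
    exemplars accumulate difficulty."""
    for i in range(len(age)):
        age[i] += 1                 # unselected exemplars grow 'older'
    for i, err in zip(chosen, errors):
        age[i] = 0                  # selected this generation: reset age
        difficulty[i] += err        # err = 1 if still misclassified, else 0
```

In each GP generation the subset is redrawn, so hard exemplars (high difficulty) and neglected exemplars (high age) are revisited preferentially; the hierarchical RSS-DSS variants apply this selection within blocks that are themselves sampled from the full dataset.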