Evolutionary induction of sparse neural trees

  • Authors:
  • Byoung-Tak Zhang; Peter Ohm; Heinz Mühlenbein

  • Affiliations:
  • Byoung-Tak Zhang: Department of Computer Engineering, Seoul National University, Seoul 151-742, Korea (btzhang@comp.snu.ac.kr)
  • Peter Ohm: German National Research Center for Information Technology (GMD), D-53754 St. Augustin, Germany (peter.ohm@gmd.de)
  • Heinz Mühlenbein: German National Research Center for Information Technology (GMD), D-53754 St. Augustin, Germany (heinz.muehlenbein@gmd.de)

  • Venue:
  • Evolutionary Computation
  • Year:
  • 1997

Abstract

This paper is concerned with the automatic induction of parsimonious neural networks. In contrast to other program induction situations, network induction entails parametric learning as well as structural adaptation. We present a novel representation scheme called neural trees that allows efficient learning of both network architectures and parameters by genetic search. A hybrid evolutionary method is developed for neural tree induction that combines genetic programming and the breeder genetic algorithm under the unified framework of the minimum description length principle. The method is successfully applied to the induction of higher order neural trees while still keeping the resulting structures sparse to ensure good generalization performance. Empirical results are provided on two chaotic time series prediction problems of practical interest.
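The abstract's core idea can be illustrated with a minimal sketch: a neural tree whose internal nodes are neurons applied to weighted children, whose leaves are input variables, and an MDL-style fitness that trades prediction error against tree size to favor sparse structures. The class names, the tanh activation, and the `alpha` penalty weight below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a neural tree with an MDL-style fitness.
# Assumptions (not from the paper): internal nodes compute
# tanh of the weighted sum of their children; leaves read an input
# variable; complexity cost is simply the node count.
import math
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    weight: float = 1.0               # weight on the edge to the parent
    var: Optional[int] = None         # input index if this node is a leaf
    children: List["Node"] = field(default_factory=list)

    def evaluate(self, x):
        if self.var is not None:      # leaf: return the input variable
            return x[self.var]
        s = sum(c.weight * c.evaluate(x) for c in self.children)
        return math.tanh(s)

    def size(self):                   # node count, used as structure cost
        return 1 + sum(c.size() for c in self.children)


def mdl_fitness(tree, data, alpha=0.01):
    """Mean squared error plus a complexity penalty, in the spirit of MDL:
    smaller trees that still fit the data score lower (better)."""
    err = sum((tree.evaluate(x) - y) ** 2 for x, y in data) / len(data)
    return err + alpha * tree.size()


# Usage: a 3-node tree computing tanh(0.5*x0 + 0.5*x1).
tree = Node(children=[Node(weight=0.5, var=0), Node(weight=0.5, var=1)])
```

A genetic search in this spirit would mutate tree structure (add/remove subtrees) while a parameter-tuning step adjusts the weights, with `mdl_fitness` selecting among candidates.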