Maximum margin decision surfaces for increased generalisation in evolutionary decision tree learning

  • Authors:
  • Alexandros Agapitos; Michael O'Neill; Anthony Brabazon; Theodoros Theodoridis

  • Affiliations:
  • Alexandros Agapitos, Michael O'Neill, Anthony Brabazon: Financial Mathematics and Computation Research Cluster, Natural Computing Research and Applications Group, University College Dublin, Ireland; Theodoros Theodoridis: School of Computer Science and Electronic Engineering, University of Essex, UK

  • Venue:
  • EuroGP'11: Proceedings of the 14th European Conference on Genetic Programming
  • Year:
  • 2011

Abstract

Decision tree learning is one of the most widely used and practical methods for inductive inference. We present a novel method that increases the generalisation of genetically induced classification trees, which employ linear discriminants as the partitioning function at each internal node. Genetic Programming is employed to search the space of oblique decision trees. At the end of the evolutionary run, a (1+1) Evolution Strategy is used to geometrically optimise the boundaries in the decision space, which are represented by the linear discriminant functions. The evolutionary optimisation maximises the decision-surface margin, defined as the smallest distance between the decision surface and any training sample. Initial empirical results on a series of datasets from the UCI repository suggest that model generalisation benefits from margin maximisation, and that the new method compares favourably with other learning algorithms on pattern classification tasks.
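The margin-maximisation step described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: a (1+1) Evolution Strategy perturbs the weights of a single linear discriminant `w·x + b` with Gaussian noise and accepts the child only when it strictly increases the minimum signed distance from the hyperplane to the samples (negative when a sample is misclassified). The fixed mutation step `sigma` and iteration budget are illustrative assumptions; in the paper's setting this optimisation would be applied to each discriminant node of the evolved oblique tree.

```python
import math
import random

def margin(w, b, X, y):
    """Smallest signed distance from the hyperplane w.x + b = 0 to any
    sample (x, y), with y in {-1, +1}; negative if any sample lies on
    the wrong side of the surface."""
    norm = math.sqrt(sum(wi * wi for wi in w))
    return min(yi * (sum(wi * xi for wi, xi in zip(w, x)) + b) / norm
               for x, yi in zip(X, y))

def one_plus_one_es(w, b, X, y, sigma=0.1, iters=2000, seed=0):
    """(1+1)-ES: Gaussian perturbation of (w, b); the child replaces the
    parent only when it strictly improves the minimum margin."""
    rng = random.Random(seed)
    best = margin(w, b, X, y)
    for _ in range(iters):
        w_child = [wi + rng.gauss(0.0, sigma) for wi in w]
        b_child = b + rng.gauss(0.0, sigma)
        m_child = margin(w_child, b_child, X, y)
        if m_child > best:  # elitist (1+1) acceptance rule
            w, b, best = w_child, b_child, m_child
    return w, b, best
```

On a toy one-dimensional problem with samples at 0 and 1 in opposite classes, the acceptance rule guarantees the margin never decreases, and the ES drifts the boundary toward the midpoint, where the margin is maximal.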