Predicate selection for structural decision trees

  • Authors:
  • K. S. Ng; J. W. Lloyd

  • Affiliations:
  • Computer Sciences Laboratory, Research School of Information Sciences and Engineering, The Australian National University (both authors)

  • Venue:
  • ILP'05: Proceedings of the 15th International Conference on Inductive Logic Programming
  • Year:
  • 2005


Abstract

We study predicate selection functions (also known as splitting rules) for structural decision trees and propose two improvements to existing schemes. The first is in classification learning, where we reconsider the use of accuracy as a predicate selection function and show that, on practical grounds, it is a better alternative to other commonly used functions. The second is in regression learning, where we consider the standard mean squared error measure and give a predicate pruning result for it.
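The abstract contrasts accuracy as a splitting criterion for classification trees with the standard mean squared error measure used in regression trees. The sketch below is only an illustration of these two generic measures, not the paper's predicate selection or pruning scheme; the function names and the toy data are assumptions.

```python
# Illustrative sketch (not the authors' implementation): scoring one candidate
# split predicate by resubstitution accuracy (classification) and by mean
# squared error reduction (regression).
from collections import Counter


def accuracy_score(left_labels, right_labels):
    """Accuracy of the two-leaf tree that predicts the majority class
    in each branch induced by the candidate predicate."""
    total = len(left_labels) + len(right_labels)
    correct = 0
    for labels in (left_labels, right_labels):
        if labels:
            correct += Counter(labels).most_common(1)[0][1]
    return correct / total


def mse_reduction(left_values, right_values):
    """Drop in mean squared error when each branch predicts its own mean
    instead of the overall mean (the standard regression splitting measure)."""
    values = left_values + right_values

    def sse(vs):  # sum of squared deviations from the mean
        if not vs:
            return 0.0
        mean = sum(vs) / len(vs)
        return sum((v - mean) ** 2 for v in vs)

    return (sse(values) - sse(left_values) - sse(right_values)) / len(values)


# Example: evaluate one candidate predicate on a small labelled sample.
left, right = ["+", "+", "-"], ["-", "-", "-", "+"]
print(accuracy_score(left, right))      # 5 of 7 examples correct, ~0.714

left_y, right_y = [1.0, 1.2, 0.9], [3.1, 2.8, 3.0]
print(mse_reduction(left_y, right_y))   # positive: the split reduces MSE
```

In tree induction, such a score would be computed for every candidate predicate and the highest-scoring one chosen; the paper's pruning result concerns bounding the mean squared error measure so that unpromising predicates can be discarded without full evaluation.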