A programming paradigm for machine learning, with a case study of Bayesian networks

  • Author: Lloyd Allison
  • Affiliation: Clayton School of Information Technology, Monash University, Victoria, Australia
  • Venue: ACSC '06: Proceedings of the 29th Australasian Computer Science Conference - Volume 48
  • Year: 2006

Abstract

Inductive programming is a new machine learning paradigm which combines functional programming, for writing statistical models, with information theory, to prevent overfitting. Type-classes specify general properties that models must have. Many statistical models, estimators and operators have polymorphic types. Useful operators combine models and estimators to form new ones; functional programming's compositional style is a great advantage in this domain. Complementing this, information theory provides a compositional measure of the complexity of a model from its parts.

Inductive programming is illustrated by a case study of Bayesian networks. Networks are built from classification (decision) trees, and trees are built from partitioning functions and models on data-spaces. Trees, and hence networks, are general as a natural consequence of the method. Discrete and continuous variables, and missing values, are handled by the networks. Finally, the Bayesian networks are applied to a challenging data set on lost persons.
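
To make the flavour of the paradigm concrete, the following is a minimal Haskell sketch, not the paper's actual library: the names Model, Estimator, msgLen, pairM, Tree and treeModel are illustrative, and models are simplified to a plain record rather than the paper's type-class hierarchy. It shows a model as something that assigns a message length, in bits, to each datum, an operator that combines two models into a model of pairs, and a classification tree built from a partitioning function and leaf models; because the message lengths of independent parts add, the complexity of a composite model can be read off from its parts.

    -- A sketch only: simple probabilistic models as message-length functions.
    import Data.List (genericLength)

    -- A model over a data-space 'a' assigns each datum a negative
    -- log-probability, i.e. its message length in bits.
    newtype Model a = Model { nlPr :: a -> Double }

    -- An estimator fits a model to a data set.
    type Estimator a = [a] -> Model a

    -- Message length of a data set under a model; the lengths of
    -- independent parts simply add.
    msgLen :: Model a -> [a] -> Double
    msgLen m = sum . map (nlPr m)

    -- An operator combining two models into a model of pairs,
    -- assuming the two components are independent.
    pairM :: Model a -> Model b -> Model (a, b)
    pairM mA mB = Model (\(x, y) -> nlPr mA x + nlPr mB y)

    -- A smoothed two-state estimator for Bool-valued data.
    estBool :: Estimator Bool
    estBool xs = Model nl
      where
        n    = genericLength xs
        t    = genericLength (filter id xs)
        p    = (t + 0.5) / (n + 1.0)
        nl x = negate (logBase 2 (if x then p else 1 - p))

    -- A classification (decision) tree over inputs 'i' and outputs 'o':
    -- a leaf holds a model of the output; a fork holds a partitioning
    -- function on inputs and one subtree per part.
    data Tree i o = Leaf (Model o)
                  | Fork (i -> Int) [Tree i o]

    -- A tree acts as a conditional model of the output given the input:
    -- route the input through the partitioning functions, then charge
    -- for the output at the chosen leaf (the input itself is not charged).
    treeModel :: Tree i o -> Model (i, o)
    treeModel (Leaf m)       = Model (\(_, y) -> nlPr m y)
    treeModel (Fork part ts) = Model (\(x, y) -> nlPr (treeModel (ts !! part x)) (x, y))

    main :: IO ()
    main = do
      let ys = [True, True, False, True, True]
      print (msgLen (estBool ys) ys)   -- total message length of ys, in bits

In this style, a Bayesian network over several variables would be assembled by composing such conditional models, one per variable, which is the compositional construction the abstract describes.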