Bayesian classifiers based on kernel density estimation: Flexible classifiers

  • Authors:
  • Aritz Pérez; Pedro Larrañaga; Iñaki Inza

  • Affiliations:
  • Intelligent Systems Group, Department of Computer Science and Artificial Intelligence, University of the Basque Country, Spain (all authors)

  • Venue:
  • International Journal of Approximate Reasoning
  • Year:
  • 2009


Abstract

When learning Bayesian network-based classifiers, continuous variables are usually handled by discretization or are assumed to follow a Gaussian distribution. This work introduces the kernel-based Bayesian network paradigm for supervised classification: a Bayesian network in which the densities of the continuous variables are estimated with kernels. In addition, the tree-augmented naive Bayes, the k-dependence Bayesian classifier and the complete graph classifier are adapted to this novel kernel-based paradigm. Moreover, the strong consistency of the presented classifiers is proved, and a kernel-based estimator of the mutual information is introduced. The classifiers presented in this work can be seen as the natural extension of the flexible naive Bayes classifier proposed by John and Langley [G.H. John, P. Langley, Estimating continuous distributions in Bayesian classifiers, in: Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, 1995, pp. 338-345], relaxing its strong independence assumption. Among the flexible classifiers, the flexible tree-augmented naive Bayes shows the best behavior for supervised classification. Furthermore, the flexible classifiers obtain classification errors that are competitive with state-of-the-art classifiers.
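
For concreteness, below is a minimal Python sketch of the flexible naive Bayes baseline that the paper extends: one univariate Gaussian-kernel density estimate per (class, feature) pair, combined under the naive independence assumption. It relies on scipy.stats.gaussian_kde with its default bandwidth and is an illustrative approximation under those assumptions, not the authors' implementation; the tree-augmented and k-dependence variants described in the abstract additionally model dependencies between predictor variables.

```python
import numpy as np
from scipy.stats import gaussian_kde


class FlexibleNaiveBayes:
    """Sketch of flexible naive Bayes: class-conditional densities via KDE.

    Assumes every class has at least two distinct values per feature,
    so that gaussian_kde can estimate a nonzero bandwidth.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {}
        self.kdes_ = {}  # maps (class, feature index) -> fitted 1D KDE
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)
            for j in range(X.shape[1]):
                self.kdes_[(c, j)] = gaussian_kde(Xc[:, j])
        return self

    def predict(self, X):
        log_post = np.zeros((len(X), len(self.classes_)))
        for i, c in enumerate(self.classes_):
            log_post[:, i] = np.log(self.priors_[c])
            for j in range(X.shape[1]):
                # Kernel estimate of p(x_j | c); clip to avoid log(0).
                dens = self.kdes_[(c, j)](X[:, j])
                log_post[:, i] += np.log(np.clip(dens, 1e-300, None))
        # Maximum a posteriori class under the naive independence assumption.
        return self.classes_[np.argmax(log_post, axis=1)]
```

Replacing each univariate KDE with a conditional density that also depends on a parent predictor is, roughly, what the flexible tree-augmented naive Bayes and k-dependence variants do; this sketch only covers the independence-based baseline.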