One Dependence Value Difference Metric

  • Authors:
  • Chaoqun Li; Hongwei Li

  • Affiliations:
  • Department of Mathematics, China University of Geosciences, Wuhan, Hubei 430074, China (both authors)

  • Venue:
  • Knowledge-Based Systems
  • Year:
  • 2011

Abstract

Many distance-related algorithms depend on a good distance metric to be successful. The Value Difference Metric (VDM) was proposed to provide a reasonable distance between each pair of instances whose attribute values are all nominal. In VDM, all attributes are assumed to be fully independent, and two values of an attribute are considered closer if they correlate more similarly with the output classes. The attribute independence assumption in VDM is rarely true in reality, which harms its performance in applications with complex attribute dependencies. In this paper, we propose an improved Value Difference Metric, called the One Dependence Value Difference Metric (ODVDM), by relaxing this unrealistic attribute independence assumption. In ODVDM, structure learning algorithms for Bayesian network classifiers, such as tree-augmented naive Bayes (TAN), are used to find the dependence relationships among the attributes. Our experimental results validate its effectiveness in terms of classification accuracy.
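To make the idea concrete, below is a minimal sketch (not taken from the paper) of the standard VDM computation for nominal attributes: the distance between two values of an attribute is the difference of their class-conditional probabilities, summed over classes and attributes. The training data `X`, labels `y`, and exponent `q` are illustrative. ODVDM, as described in the abstract, would additionally condition each attribute's class probabilities on the value of a parent attribute found by a TAN-style structure learner; that extension is not implemented here.

```python
from collections import Counter

def train_vdm(X, y, q=2):
    """Estimate per-attribute conditional probabilities P(c | attribute value)
    from nominal training data X (list of tuples) and class labels y,
    and return a VDM distance function over instances."""
    n_attrs = len(X[0])
    classes = sorted(set(y))
    # value_class_counts[a][(v, c)] = #instances with attribute a == v and class c
    value_class_counts = [Counter() for _ in range(n_attrs)]
    value_counts = [Counter() for _ in range(n_attrs)]
    for xi, c in zip(X, y):
        for a, v in enumerate(xi):
            value_class_counts[a][(v, c)] += 1
            value_counts[a][v] += 1

    def vdm_distance(x1, x2):
        """VDM: sum over attributes and classes of |P(c|v1) - P(c|v2)| ** q."""
        dist = 0.0
        for a in range(n_attrs):
            v1, v2 = x1[a], x2[a]
            n1, n2 = value_counts[a][v1], value_counts[a][v2]
            for c in classes:
                p1 = value_class_counts[a][(v1, c)] / n1 if n1 else 0.0
                p2 = value_class_counts[a][(v2, c)] / n2 if n2 else 0.0
                dist += abs(p1 - p2) ** q
        return dist

    return vdm_distance

# Toy usage with nominal attributes only.
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
d = train_vdm(X, y)
print(d(("sunny", "hot"), ("rain", "cool")))   # large: values differ in class correlation
print(d(("sunny", "hot"), ("sunny", "mild")))  # smaller: values correlate more similarly
```

In this sketch the per-value class probabilities are estimated from raw frequencies; the attributes are treated as independent, which is exactly the assumption ODVDM relaxes by conditioning on a single parent attribute per attribute.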