The entropy of relations and a new approach for decision tree learning

  • Authors:
  • Dan Hu; HongXing Li

  • Affiliations:
  • Department of Mathematics, Beijing Normal University, Beijing, China; Department of Mathematics, Beijing Normal University, Beijing, China

  • Venue:
  • FSKD'05 Proceedings of the Second international conference on Fuzzy Systems and Knowledge Discovery - Volume Part II
  • Year:
  • 2005


Abstract

A formula for measuring how much information is contained in a relation on a finite universe is proposed; it is called the entropy of the relation R and denoted H(R). Based on H(R), the entropy of predicates and the information of propositions are measured. These measures can be used to evaluate predicates and to choose the most appropriate predicate for a given Cartesian set. Finally, H(R) is used to induce decision trees. Experiments show that the new induction algorithm, denoted IDIR, outperforms ID3 in terms of the number of tree nodes and test time.
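
The abstract does not state the formula for H(R). As a rough illustration only, the sketch below assumes an entropy computed from the sizes of the successor sets R(x) = {y : (x, y) ∈ R}, one common style of definition for information measures on relations over a finite universe; the function name and the exact formula are assumptions, not the paper's definition.

```python
import math

def relation_entropy(universe, relation):
    """Assumed formulation: entropy of a binary relation R on a finite
    universe U, computed from the sizes of the successor sets
    R(x) = {y in U : (x, y) in R}.  The paper's exact H(R) may differ."""
    n = len(universe)
    h = 0.0
    for x in universe:
        r_x = {y for y in universe if (x, y) in relation}
        p = len(r_x) / n                   # relative size of R(x)
        if p > 0:
            h -= (1.0 / n) * math.log2(p)  # average information per element
    return h

# Example: an equivalence relation with two classes {1, 2} and {3, 4}
U = [1, 2, 3, 4]
R = {(a, b) for a in U for b in U if (a <= 2) == (b <= 2)}
print(relation_entropy(U, R))  # 1.0 bit for two equal-sized classes
```

Under this assumed definition, a finer relation (smaller successor sets) yields a larger entropy, which is the kind of behavior a split criterion for decision tree induction would exploit.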