Construction of decision trees by using feature importance value for improved learning performance

  • Authors:
  • Md. Ridwan Al Iqbal;Mohammad Saiedur Rahaman;Syed Irfan Nabil

  • Affiliations:
  • Department of Computer Science, American International University-Bangladesh (AIUB), Banani, Dhaka, Bangladesh (all authors)

  • Venue:
  • ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part II
  • Year:
  • 2012

Abstract

Decision tree algorithms struggle to learn accurately from small training sets because they recursively partition the data, leaving very few instances at the lower levels of the tree. Additional domain knowledge has been shown to enhance learner performance. We present an algorithm named Importance Aided Decision Tree (IADT) that takes feature importance as additional domain knowledge. A decision tree algorithm seeks the most important attribute at each node, so feature importance scores are a natural fit for decision tree learning. Our algorithm uses a novel approach to incorporate these importance scores into tree induction, making the resulting trees more accurate and robust. We present theoretical and empirical performance analyses showing that IADT is superior to standard decision tree learning algorithms.
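To illustrate the general idea, here is a minimal sketch of how an externally supplied feature-importance score might bias attribute selection during tree induction. The combination rule below (multiplying information gain by the importance weight) is an assumption for illustration only; the paper's actual IADT formula may differ.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting rows (dicts) on attribute attr."""
    parts = {}
    for row, y in zip(rows, labels):
        parts.setdefault(row[attr], []).append(y)
    n = len(labels)
    remainder = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - remainder

def best_attribute(rows, labels, attrs, importance):
    # Weight each attribute's information gain by its externally
    # supplied importance score. This product rule is a hypothetical
    # stand-in for IADT's actual incorporation scheme.
    return max(attrs, key=lambda a: importance.get(a, 1.0) * info_gain(rows, labels, a))
```

A standard greedy learner would call `best_attribute` at each node; the importance dictionary lets domain knowledge steer the split choice even when the few remaining instances make the gain estimates unreliable.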