Active Learning with Automatic Soft Labeling for Induction of Decision Trees

  • Authors:
  • Jiang Su; Jelber Sayyad Shirabad; Stan Matwin; Jin Huang

  • Affiliations:
  • SITE, University of Ottawa, Ottawa, Canada K1N 6N5; SITE, University of Ottawa, Ottawa, Canada K1N 6N5; SITE, University of Ottawa, Ottawa, Canada K1N 6N5 and Institute of Computer Science, Polish Academy of Sciences, Warsaw, Poland; SITE, University of Ottawa, Ottawa, Canada K1N 6N5

  • Venue:
  • Canadian AI '09 Proceedings of the 22nd Canadian Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2009

Abstract

Decision trees have been widely used in many data mining applications because of their interpretable representation. However, learning an accurate decision tree model often requires a large amount of labeled training data, and labeling data is costly and time-consuming. In this paper, we study how to learn decision trees at lower labeling cost from two perspectives: data quality and data quantity. At each step of the active learning process, we learn a random forest and then use it to label a large quantity of unlabeled data. To counter the growth in tree size caused by machine labeling, we generate weighted (soft) labeled data using the prediction confidence of the labeling classifier. Empirical studies show that our method significantly reduces the labeling cost of active learning for decision tree learning, and that this improvement comes without any increase in tree size.
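The core step described in the abstract can be sketched in a few lines: a random forest trained on the small labeled pool machine-labels the unlabeled pool, and its per-example prediction confidence becomes a sample weight when the final decision tree is trained. This is a minimal illustration in scikit-learn under assumed settings (synthetic data, a fixed labeled/unlabeled split, weight 1.0 for human labels), not the authors' implementation.

```python
# Soft labeling with a random forest's prediction confidence: an assumed
# scikit-learn sketch of the idea in the abstract, not the paper's code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Small labeled pool (as if selected by active learning) and a large unlabeled pool.
labeled_idx = np.arange(100)
unlabeled_idx = np.arange(100, 2000)

# 1. Learn a random forest on the currently labeled data.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X[labeled_idx], y[labeled_idx])

# 2. Machine-label the unlabeled pool, keeping the prediction confidence.
proba = forest.predict_proba(X[unlabeled_idx])
machine_labels = proba.argmax(axis=1)
confidence = proba.max(axis=1)  # soft-label weight for each machine-labeled example

# 3. Train the final decision tree on human-labeled data (weight 1.0) plus
#    machine-labeled data weighted by the forest's confidence.
X_all = np.vstack([X[labeled_idx], X[unlabeled_idx]])
y_all = np.concatenate([y[labeled_idx], machine_labels])
w_all = np.concatenate([np.ones(len(labeled_idx)), confidence])

tree = DecisionTreeClassifier(random_state=0)
tree.fit(X_all, y_all, sample_weight=w_all)
print(tree.get_n_leaves())
```

Down-weighting low-confidence machine labels is what keeps the tree small: uncertain examples contribute little to the split criterion, so the tree is not forced to grow branches to fit labeling noise.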