Learning Conditional Independence Tree for Ranking

  • Authors:
  • Jiang Su; Harry Zhang

  • Affiliations:
  • University of New Brunswick, Canada; University of New Brunswick, Canada

  • Venue:
  • ICDM '04 Proceedings of the Fourth IEEE International Conference on Data Mining
  • Year:
  • 2004


Abstract

Accurate ranking is desired in many real-world data mining applications. Traditional learning algorithms, however, aim only at high classification accuracy. It has been observed that both traditional decision trees and naive Bayes produce good classification accuracy but poor probability estimates. In this paper, we study a new model, the conditional independence tree (CITree), which combines decision trees and naive Bayes and is both better suited to ranking and more learnable in practice. We propose a novel algorithm for learning CITrees for ranking, and our experiments show that it significantly outperforms both the state-of-the-art decision tree learning algorithm C4.4 and naive Bayes in producing accurate rankings. Our work provides an effective data mining algorithm for applications in which accurate ranking is required.
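To make the decision-tree-plus-naive-Bayes idea concrete, below is a minimal Python sketch of a tree whose leaves each hold a naive Bayes model, with the leaf model's class-probability estimates used as ranking scores (evaluated here with AUC). This is only an illustration of the general structure described in the abstract, not the authors' CITree learning algorithm; the class name, tree depth, dataset, and use of scikit-learn are all assumptions.

```python
# Illustrative sketch (not the paper's CITree algorithm): a shallow decision
# tree partitions the data, a naive Bayes model is fit inside each leaf, and
# the leaf model's posterior probabilities serve as ranking scores.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier


class TreeWithNaiveBayesLeaves:
    """Decision tree that routes instances to leaves, where a per-leaf
    naive Bayes model supplies class-probability estimates for ranking."""

    def __init__(self, max_depth=2):
        self.tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
        self.leaf_models = {}

    def fit(self, X, y):
        self.tree.fit(X, y)
        self.classes_ = np.unique(y)
        leaves = self.tree.apply(X)              # leaf index for each training row
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            if len(np.unique(y[mask])) > 1:
                self.leaf_models[leaf] = GaussianNB().fit(X[mask], y[mask])
            else:
                self.leaf_models[leaf] = y[mask][0]   # pure leaf: constant class
        return self

    def predict_proba(self, X):
        leaves = self.tree.apply(X)
        proba = np.zeros((len(X), len(self.classes_)))
        for i, leaf in enumerate(leaves):
            model = self.leaf_models[leaf]
            if isinstance(model, GaussianNB):
                # Map the leaf model's local class order onto the global one.
                for cls, p in zip(model.classes_, model.predict_proba(X[i:i + 1])[0]):
                    proba[i, np.searchsorted(self.classes_, cls)] = p
            else:
                proba[i, np.searchsorted(self.classes_, model)] = 1.0
        return proba


X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = TreeWithNaiveBayesLeaves(max_depth=2).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]         # probability of the positive class
print("ranking quality (AUC):", roc_auc_score(y_te, scores))
```

The AUC computed at the end reflects ranking quality rather than plain classification accuracy, which is the evaluation emphasis the abstract describes.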