Combining Multiple K-Nearest Neighbor Classifiers for Text Classification by Reducts

  • Authors:
  • Yongguang Bao, Naohiro Ishii

  • Venue:
  • DS '02 Proceedings of the 5th International Conference on Discovery Science
  • Year:
  • 2002

Abstract

The basic k-nearest neighbor classifier works well in text classification, but improving its performance remains an attractive goal. Combining multiple classifiers is an effective technique for improving accuracy. Many general combining algorithms, such as Bagging or Boosting, significantly improve classifiers such as decision trees, rule learners, or neural networks. Unfortunately, these combining methods do not improve nearest neighbor classifiers. In this paper we present a new approach to generating multiple reducts based on rough set theory, and we apply these multiple reducts to improve the performance of the k-nearest neighbor classifier. The paper describes the proposed technique and provides experimental results.
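The combining idea in the abstract can be illustrated with a minimal sketch: train one k-NN classifier per reduct (here, simply a subset of feature indices) and take a majority vote over their predictions. This is an assumption-laden toy illustration, not the paper's actual algorithm; the `reducts` list, the Euclidean distance, and the tiny data set are all hypothetical stand-ins.

```python
from collections import Counter
import math

def knn_predict(train, labels, x, feats, k=3):
    """Classify x by majority label among its k nearest training points,
    measuring distance only on the feature indices in `feats` (a 'reduct')."""
    dists = sorted(
        (math.dist([t[i] for i in feats], [x[i] for i in feats]), y)
        for t, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

def ensemble_predict(train, labels, x, reducts, k=3):
    """Majority vote over one k-NN classifier per reduct."""
    votes = Counter(knn_predict(train, labels, x, feats, k) for feats in reducts)
    return votes.most_common(1)[0][0]

# Toy data: 4 binary features; the class happens to depend on features 0 and 2.
train = [(0, 0, 0, 1), (0, 1, 0, 0), (1, 0, 1, 1), (1, 1, 1, 0)]
labels = ["neg", "neg", "pos", "pos"]
reducts = [(0, 2), (0, 3), (1, 2)]  # hypothetical reducts (feature subsets)

print(ensemble_predict(train, labels, (1, 0, 1, 0), reducts, k=1))
```

Each base classifier sees the same training data through a different reduct, so the ensemble members disagree in useful ways even though plain Bagging does little for k-NN (resampling barely changes nearest neighbors, while changing the feature subset does).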