Combination of multiple nearest neighbor classifiers based on feature subset clustering method

  • Authors:
  • Li-Juan Wang; Qiang Hua; Xiao-Long Wang; Qing-Cai Chen

  • Affiliations:
  • Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China; Machine Learning Center, Faculty of Mathematics and Computer Science, Hebei University, Baoding, China; Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China; Department of Computer Science and Technology, Harbin Institute of Technology, Harbin, China

  • Venue:
  • ICMLC'05: Proceedings of the 4th International Conference on Advances in Machine Learning and Cybernetics
  • Year:
  • 2005

Abstract

This paper proposes a new method, FC-MNNC, which combines multiple nearest neighbor classifiers (NNCs) based on feature subset clustering to obtain better performance than a single NNC. In FC-MNNC, the feature set is partitioned into subsets, each component NNC classifies a pattern independently and in parallel on its own subset, and the final decision is aggregated by majority voting. Two methods are used to partition the feature set: in method I, a genetic algorithm (GA) clusters the features into subsets according to the accuracy of the combined classification; method II is a transitive closure clustering method based on the pairwise correlation between features. To evaluate FC-MNNC, experiments are conducted on four UCI datasets. The results show that: (i) in FC-MNNC, method II does not perform better than method I; (ii) FC-MNNC based on method I is more accurate than both the standard NNC and a single classifier with GA-based feature selection; (iii) FC-MNNC based on method I performs no worse than GA-based feature subset selection applied to multiple NNCs; and (iv) FC-MNNC is robust against irrelevant features.
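
The sketch below illustrates the general FC-MNNC idea described in the abstract, under stated assumptions: it uses scikit-learn's 1-NN as the component classifier, stands in a simple correlation-threshold transitive-closure grouping for method II, and omits the GA-based partition of method I. The function names, the correlation threshold, and the tie-breaking rule are illustrative choices, not details taken from the paper.

```python
# Illustrative sketch of an FC-MNNC-style ensemble (assumptions noted above).
# Features are grouped by transitive-closure clustering of pairwise correlations,
# one 1-NN classifier is fit per feature subset, and predictions are combined
# by majority voting.
import numpy as np
from scipy.sparse.csgraph import connected_components
from sklearn.neighbors import KNeighborsClassifier


def cluster_features_by_correlation(X, threshold=0.5):
    """Group features whose absolute pairwise correlation exceeds `threshold`,
    taking the transitive closure (connected components of the threshold graph)."""
    corr = np.abs(np.corrcoef(X, rowvar=False))   # feature-by-feature correlations
    adjacency = corr >= threshold                 # edge = "sufficiently correlated"
    n_groups, labels = connected_components(adjacency, directed=False)
    return [np.flatnonzero(labels == g) for g in range(n_groups)]


def fit_component_nncs(X_train, y_train, subsets):
    """Fit one nearest neighbor classifier per feature subset."""
    models = []
    for subset in subsets:
        nnc = KNeighborsClassifier(n_neighbors=1)
        nnc.fit(X_train[:, subset], y_train)
        models.append(nnc)
    return models


def majority_vote_predict(models, subsets, X_test):
    """Each component NNC classifies every pattern independently on its subset;
    the final label is the majority vote across components."""
    votes = np.stack([m.predict(X_test[:, s]) for m, s in zip(models, subsets)])
    predictions = []
    for column in votes.T:                        # one column of votes per pattern
        values, counts = np.unique(column, return_counts=True)
        predictions.append(values[np.argmax(counts)])  # ties -> smallest label
    return np.array(predictions)
```

A GA-based variant (the paper's method I) would instead search over candidate partitions and keep the one whose combined, majority-voted accuracy is highest; only the feature-grouping step changes, while the per-subset NNCs and the voting stage stay the same.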