A comparison of soft fusion methods under different bagging scenarios

  • Authors:
  • Fuad Alkoot; Hussain Qasem

  • Venue:
  • CIMMACS'07 Proceedings of the 6th WSEAS international conference on Computational intelligence, man-machine systems and cybernetics
  • Year:
  • 2007

Abstract

We experiment with different fusion methods when bagging k-NN classifiers under various conditions. Experiments with four types of bagging are made at four training-set sizes, using two metrics. The aim is to find the conditions for optimum bagging performance and, additionally, the best rule under the specified conditions. We compare the performance of the different fusion strategies under each condition. The fusion methods used are Sum, Modified Product (MProduct) [2], Vote and Moderation [1]. Results show that performance depends on the data used, the number of nearest neighbors (k), the number of fused classifiers and the size of the training set. Overall, the three rules derived from Product perform closely, while Vote behaves in the opposite manner. Among the three rules, Moderation follows either Sum or MProduct. Results indicate that MProduct outperforms Sum in many instances; in some of these, Sum did not outperform the single classifier while MProduct did. Moderation is the second best, while Vote is inferior, especially at even values of k. This is an inherent weakness of Vote: with an even number of close samples, ties are resolved randomly. At k=1 all rules yield similar results. There are a few instances where Moderation outperforms all other rules. In general, MProduct is the best choice.
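The setup described above, bagging k-NN classifiers and combining their soft outputs with a fusion rule, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `KNeighborsClassifier` and a toy dataset, and it shows only the basic Sum, Product and Vote rules (MProduct and Moderation are defined in the cited references [2] and [1]).

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)                      # toy stand-in dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: train each k-NN classifier on a bootstrap replicate of the
# training set (sampling with replacement, same size as the original).
n_estimators, k = 10, 3
probs = []                                             # per-classifier soft outputs
for _ in range(n_estimators):
    idx = rng.integers(0, len(X_tr), len(X_tr))        # bootstrap indices
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr[idx], y_tr[idx])
    probs.append(clf.predict_proba(X_te))
probs = np.stack(probs)                # shape: (n_estimators, n_test, n_classes)

# Soft fusion rules: combine the class posteriors across the ensemble,
# then assign each test sample to the arg-max class.
sum_pred = probs.sum(axis=0).argmax(axis=1)            # Sum rule
prod_pred = probs.prod(axis=0).argmax(axis=1)          # Product rule
# Vote rule: each classifier casts a hard vote; majority wins. Note that
# np.bincount(...).argmax() breaks ties by the lowest class index, whereas
# the paper points out that ties (common at even k) are resolved randomly.
hard_votes = probs.argmax(axis=2)                      # (n_estimators, n_test)
vote_pred = np.apply_along_axis(
    lambda v: np.bincount(v, minlength=probs.shape[2]).argmax(), 0, hard_votes)

for name, pred in [("Sum", sum_pred), ("Product", prod_pred), ("Vote", vote_pred)]:
    print(f"{name}: accuracy {np.mean(pred == y_te):.3f}")
```

Varying `k`, `n_estimators` and the training-set size in this sketch mirrors the experimental conditions compared in the paper.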