Classifier Instability and Partitioning

  • Authors:
  • Terry Windeatt

  • Affiliations:
  • -

  • Venue:
  • MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems

  • Year:
  • 2000

Abstract

Various methods exist for reducing correlation between classifiers in a multiple classifier framework. The expectation is that the composite classifier will exhibit improved performance and/or be simpler to automate than a single classifier. In this paper we investigate how generalisation is affected by varying the complexity of unstable base classifiers, implemented as identical single-hidden-layer MLP networks with fixed parameters. A technique that uses recursive partitioning to selectively perturb the training set is also introduced, and is shown to improve performance and reduce sensitivity to base-classifier complexity. Benchmark experiments include artificial and real data with optimal error rates greater than eighteen percent.
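The general setup the abstract describes can be illustrated with a minimal sketch. The code below is not the paper's recursive-partitioning technique; it trains an ensemble of identical single-hidden-layer MLP base classifiers on perturbed copies of the training set (here, simple bootstrap resampling as a stand-in for selective perturbation) and combines them by majority vote. The synthetic dataset, scikit-learn MLPClassifier, hidden-layer size, and ensemble size are all illustrative assumptions.

    # Sketch only: ensemble of identical single-hidden-layer MLPs trained on
    # perturbed (bootstrap-resampled) training sets, combined by majority vote.
    # Dataset and hyperparameters are illustrative, not from the paper.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.RandomState(0)
    # flip_y adds label noise so the optimal error rate is nonzero.
    X, y = make_classification(n_samples=600, n_features=20, flip_y=0.2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    n_classifiers = 15
    hidden_units = 8  # fixed base-classifier complexity

    predictions = []
    for i in range(n_classifiers):
        # Perturb the training set by bootstrap resampling (a stand-in for
        # the paper's selective perturbation via recursive partitioning).
        idx = rng.choice(len(X_train), size=len(X_train), replace=True)
        clf = MLPClassifier(hidden_layer_sizes=(hidden_units,), max_iter=500,
                            random_state=i)
        clf.fit(X_train[idx], y_train[idx])
        predictions.append(clf.predict(X_test))

    # Majority vote across the unstable base classifiers (binary labels 0/1).
    votes = np.stack(predictions)
    ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
    print("Ensemble test accuracy:", (ensemble_pred == y_test).mean())

In this bagging-style sketch, hidden_units is the knob corresponding to base-classifier complexity, the quantity whose effect on generalisation the paper investigates.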