A hierarchical classifier with growing neural gas clustering

  • Authors:
  • Igor T. Podolak; Kamil Bartocha

  • Affiliations:
  • Institute of Computer Science, Faculty of Mathematics and Computer Science, Jagiellonian University, Kraków, Poland (both authors)

  • Venue:
  • ICANNGA'09: Proceedings of the 9th International Conference on Adaptive and Natural Computing Algorithms
  • Year:
  • 2009

Abstract

A novel architecture for a hierarchical classifier (HC) is defined. The objective is to combine several weak classifiers into a strong one, but the approach differs from known methods such as AdaBoost: the training set is split between groups of output classes on the basis of the previous classifier's misclassifications. The problem is thereby split into overlapping sub-problems, each classifying into a different subset of the output classes. Because every sub-problem has fewer output classes, the task size is reduced and higher accuracy can be achieved. In contrast to boosting, the groups of output classes overlap, so examples from a single class may end up in several sub-problems. It is shown that such a hierarchical classifier achieves better accuracy, and a notion of generalized accuracy is introduced. Sub-problem generation is simple, since it is performed with a clustering algorithm operating on the classifier outputs. We propose to use the Growing Neural Gas algorithm [1] because of its good adaptive properties.
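
The abstract describes the sub-problem construction only in words, so the following is a minimal, hypothetical sketch of that step in Python. All names (build_overlapping_subproblems, overlap_margin, etc.) are assumptions, a logistic regression stands in for the weak root classifier, and scikit-learn's KMeans is used in place of Growing Neural Gas purely to keep the example short. It illustrates the idea of clustering classifier outputs into overlapping groups of classes; it is not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression


def build_overlapping_subproblems(X, y, n_groups=3, overlap_margin=0.15):
    """Group output classes into overlapping sub-problems by clustering the
    per-class mean outputs of a weak first-stage classifier."""
    # Weak root classifier; its soft outputs capture which classes it confuses.
    weak = LogisticRegression(max_iter=500).fit(X, y)
    proba = weak.predict_proba(X)
    classes = weak.classes_

    # One "confusion profile" per true class: the mean output vector over
    # all training examples of that class.
    profiles = np.vstack([proba[y == c].mean(axis=0) for c in classes])

    # Cluster the class profiles (KMeans here; the paper uses Growing Neural Gas).
    km = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit(profiles)

    # Assign each class to every cluster whose centroid is nearly as close as
    # its nearest one, so frequently confused classes land in several sub-problems.
    dists = np.linalg.norm(
        profiles[:, None, :] - km.cluster_centers_[None, :, :], axis=2)
    nearest = dists.min(axis=1)
    subproblems = [set(classes[dists[:, g] <= nearest + overlap_margin])
                   for g in range(n_groups)]
    return weak, subproblems


if __name__ == "__main__":
    from sklearn.datasets import load_digits
    X, y = load_digits(return_X_y=True)
    root, groups = build_overlapping_subproblems(X, y)
    print(groups)  # overlapping subsets of the 10 digit classes
```

Each returned set of classes would define one smaller classification sub-problem on which a further sub-classifier is trained; because the groups overlap, a single class can contribute training examples to more than one of them.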