Self-organizing neural grove: effective multiple classifier system with pruned self-generating neural trees

  • Authors:
  • Hirotaka Inoue

  • Affiliations:
  • Department of Electrical Eng. & Information Science, Kure National College of Technology, Kure-shi, Hiroshima, Japan

  • Venue:
  • ICCOMP'06 Proceedings of the 10th WSEAS international conference on Computers
  • Year:
  • 2006

Abstract

Multiple classifier systems (MCS) have become popular during the last decade. The self-generating neural tree (SGNT) is a suitable base classifier for MCS because of its simple setup and fast learning. However, the computation cost of the MCS increases in proportion to the number of SGNTs. In an earlier paper, we proposed a pruning method for the structure of the SGNT in the MCS to reduce the computation cost. In this paper, we propose a novel pruning method for effective processing and call the resulting model the self-organizing neural grove (SONG). The pruning method combines an on-line pruning method with an off-line pruning method. We implement the SONG with two sampling methods. Experiments have been conducted to compare the SONG with an unpruned MCS based on the SGNT, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the SONG improves classification accuracy while reducing the computation cost.
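
The abstract does not spell out the SGNT construction or the on-line pruning rule, but the general MCS-plus-pruning idea it describes can be sketched. The example below is a minimal, illustrative sketch only: it substitutes scikit-learn decision trees for SGNTs, uses bootstrap sampling as one plausible choice among the sampling methods mentioned, and applies a greedy off-line pruning pass on a held-out validation set. It is not the authors' algorithm; all function and variable names are introduced here for illustration.

```python
# Illustrative sketch: ensemble construction by bagging plus greedy off-line
# pruning, standing in for the SONG pipeline described in the abstract.
# Decision trees replace SGNTs, which are not specified in this abstract.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Build the ensemble from bootstrap samples of the training set.
n_members = 25
members = []
for _ in range(n_members):
    idx = rng.integers(0, len(X_train), len(X_train))  # bootstrap resample
    members.append(DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx]))

def ensemble_accuracy(ms, X, y):
    """Majority-vote accuracy of the current member set."""
    votes = np.stack([m.predict(X) for m in ms])
    majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    return accuracy_score(y, majority)

# Greedy off-line pruning: drop a member whenever the validation vote is no
# worse without it, so the final ensemble is cheaper to evaluate.
pruned = list(members)
for m in list(pruned):
    if len(pruned) > 1:
        trial = [t for t in pruned if t is not m]
        if ensemble_accuracy(trial, X_val, y_val) >= ensemble_accuracy(pruned, X_val, y_val):
            pruned = trial

print(f"members kept: {len(pruned)}/{n_members}, "
      f"validation accuracy: {ensemble_accuracy(pruned, X_val, y_val):.3f}")
```

The pruning loop mirrors the cost-reduction goal stated in the abstract: ensemble members that do not contribute to the combined vote are discarded, trading a small amount of extra validation work at training time for a smaller, faster classifier at prediction time.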