A balanced neural tree for pattern classification

  • Authors:
  • Christian Micheloni, Asha Rani, Sanjeev Kumar, Gian Luca Foresti

  • Affiliations:
  • AVIRES Lab, Department of Mathematics and Computer Science, University of Udine, Via Della Scienze-206, Udine, Italy (Micheloni, Rani, Foresti); Department of Mathematics, Indian Institute of Technology Roorkee, Roorkee-247667, India (Kumar)

  • Venue:
  • Neural Networks
  • Year:
  • 2012

Abstract

This paper proposes a new neural tree (NT) architecture, the balanced neural tree (BNT), which reduces tree size and improves classification accuracy with respect to classical NTs. Two main innovations are introduced: (a) perceptron substitution and (b) pattern removal. The first aims to balance the structure of the tree: if the last-trained perceptron largely misclassifies the training set, grouping it into only a reduced number of classes, that perceptron is substituted with a new one. The second introduces a new criterion for removing tough training patterns that cause over-fitting. Finally, a new error function based on the depth of the tree is introduced to reduce perceptron training time. The proposed BNT has been tested on various synthetic and real datasets. The experimental results show that it yields satisfactory results in terms of both tree depth reduction and classification accuracy.
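As a rough illustration of the training loop the abstract describes, the sketch below grows a neural tree in which each internal node is a perceptron that routes patterns to child nodes, substitutes the perceptron when the resulting split is highly unbalanced, and prunes isolated "tough" patterns before recursing. It is a minimal sketch under stated assumptions: the balance threshold, the re-initialisation used for substitution, the pattern-removal rule, and all function names (`train_bnt_node`, `bnt_predict`, `balance_thresh`) are illustrative choices, not the paper's exact criteria or depth-based error function.

```python
import numpy as np


class Perceptron:
    """One linear output unit per class; the winning unit routes the pattern."""

    def __init__(self, n_features, n_classes, lr=0.1, epochs=50, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.01, size=(n_classes, n_features + 1))
        self.lr, self.epochs = lr, epochs

    def _scores(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias input
        return Xb @ self.W.T

    def fit(self, X, y):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        for _ in range(self.epochs):
            for xi, yi in zip(Xb, y):
                pred = int(np.argmax(self.W @ xi))
                if pred != yi:  # standard multi-class perceptron update
                    self.W[yi] += self.lr * xi
                    self.W[pred] -= self.lr * xi
        return self

    def predict(self, X):
        return np.argmax(self._scores(X), axis=1)


def train_bnt_node(X, y, depth=0, max_depth=10, balance_thresh=0.9):
    """Grow one neural-tree node; thresholds are illustrative assumptions."""
    classes = np.unique(y)
    if len(classes) == 1 or depth >= max_depth:
        return {"leaf": True, "label": int(np.bincount(y).argmax())}

    y_local = np.searchsorted(classes, y)  # relabel to 0..k-1 for this node
    perc = Perceptron(X.shape[1], len(classes), seed=depth).fit(X, y_local)
    split = perc.predict(X)

    # (a) perceptron substitution (sketch): if almost all patterns fall into a
    # single child the split is unbalanced, so replace the perceptron with a
    # re-initialised one (the paper's actual substitution rule may differ).
    if np.bincount(split, minlength=len(classes)).max() / len(y) > balance_thresh:
        perc = Perceptron(X.shape[1], len(classes), seed=depth + 1).fit(X, y_local)
        split = perc.predict(X)

    if len(np.unique(split)) == 1:  # still no useful split: stop with a leaf
        return {"leaf": True, "label": int(np.bincount(y).argmax())}

    node = {"leaf": False, "perceptron": perc,
            "fallback": int(np.bincount(y).argmax()), "children": {}}
    for k in range(len(classes)):
        mask = split == k
        if not mask.any():
            continue
        Xk, yk = X[mask], y[mask]
        # (b) pattern removal (sketch): drop isolated "tough" patterns whose
        # class barely appears in this child and would force extra splits.
        counts = np.bincount(yk)
        keep = counts[yk] >= 2
        if not keep.any():
            keep[:] = True
        node["children"][k] = train_bnt_node(Xk[keep], yk[keep], depth + 1,
                                             max_depth, balance_thresh)
    return node


def bnt_predict(node, x):
    """Route a single pattern down the tree to a leaf label."""
    while not node["leaf"]:
        k = int(node["perceptron"].predict(x[None, :])[0])
        child = node["children"].get(k)
        if child is None:
            return node["fallback"]
        node = child
    return node["label"]


if __name__ == "__main__":
    # Toy three-class problem, purely to exercise the sketch.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 0] - X[:, 1] > 1).astype(int)
    tree = train_bnt_node(X, y)
    preds = np.array([bnt_predict(tree, x) for x in X])
    print("training accuracy:", (preds == y).mean())
```

The substitution check mirrors the balancing idea only at the level of split proportions; the paper's depth-dependent error function for speeding up perceptron training is not modelled here.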