A hierarchical neural network architecture for classification

  • Authors:
  • Jing Wang, Haibo He, Yuan Cao, Jin Xu, Dongbin Zhao

  • Affiliations:
  • Jing Wang, Haibo He: Department of Electrical, Computer and Biomedical Engineering, University of Rhode Island, Kingston, RI
  • Yuan Cao: MathWorks, Inc., Natick, MA
  • Jin Xu: Department of Electrical and Computer Engineering, Stevens Institute of Technology, Hoboken, NJ
  • Dongbin Zhao: State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China

  • Venue:
  • ISNN'12 Proceedings of the 9th international conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2012


Abstract

In this paper, a hierarchical neural network with a cascading architecture is proposed, and its application to classification is analyzed. The cascading architecture consists of multiple levels of neural networks, in which the outputs of the hidden neurons at a higher hierarchical level are treated as input data to the input neurons at the next lower level. The final prediction is obtained through a modified weighted majority vote scheme. In this way, it is expected that new patterns can be learned from the hidden layers at each level, so that combining the levels' outputs can significantly improve the learning performance of the whole system. In simulation, our approach is compared with two popular ensemble learning approaches, bagging and AdaBoost. Results on both synthetic and real data demonstrate that the proposed approach improves classification performance.
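The cascading idea described in the abstract can be sketched in code. The following is a minimal NumPy illustration, not the authors' implementation: it trains a small one-hidden-layer network on the raw features (level 1), feeds that network's hidden-unit activations as the input to a second network (level 2), and then combines the two levels' predictions with a weighted majority vote. The accuracy-based vote weights and all network sizes are assumptions chosen for the sketch; the paper's exact weighting scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TinyMLP:
    """One-hidden-layer binary classifier trained by plain gradient descent."""
    def __init__(self, n_in, n_hidden, lr=0.5, epochs=300, seed=0):
        r = np.random.default_rng(seed)
        self.W1 = r.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = r.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = 0.0
        self.lr, self.epochs = lr, epochs

    def hidden(self, X):
        # Hidden-layer activations: these become the "equivalent input
        # data" passed to the next level of the cascade.
        return np.tanh(X @ self.W1 + self.b1)

    def predict_proba(self, X):
        return sigmoid(self.hidden(X) @ self.W2 + self.b2).ravel()

    def predict(self, X):
        return (self.predict_proba(X) >= 0.5).astype(int)

    def fit(self, X, y):
        n = len(y)
        for _ in range(self.epochs):
            H = self.hidden(X)
            p = sigmoid(H @ self.W2 + self.b2).ravel()
            d = (p - y) / n                            # dLoss/dlogit (cross-entropy)
            dH = (d[:, None] @ self.W2.T) * (1 - H**2) # backprop through tanh
            self.W2 -= self.lr * (H.T @ d[:, None])
            self.b2 -= self.lr * d.sum()
            self.W1 -= self.lr * (X.T @ dH)
            self.b1 -= self.lr * dH.sum(0)
        return self

# Synthetic two-blob binary classification data.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Level 1 learns from the raw features; level 2 learns from the
# hidden-unit activations of level 1 (the cascading architecture).
level1 = TinyMLP(2, 8, seed=1).fit(X, y)
H1 = level1.hidden(X)
level2 = TinyMLP(8, 8, seed=2).fit(H1, y)

# Weighted majority vote: each level votes with a weight proportional
# to its training accuracy (an assumed weighting for this sketch).
preds = [level1.predict(X), level2.predict(H1)]
weights = [np.mean(p == y) for p in preds]
vote = sum(w * (2 * p - 1) for w, p in zip(weights, preds))  # {0,1} -> {-1,+1}
final = (vote >= 0).astype(int)
print("ensemble training accuracy:", np.mean(final == y))
```

Extending the sketch to more than two levels only requires repeating the hidden-activation hand-off, and the vote generalizes to any number of levels.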