Sequential multi-layers covering neural network

  • Authors:
  • Wang Renwu; Yang Hongshan; Chen Jiaxun

  • Affiliations:
  • College of Information Sciences and Technology, DongHua University, Shanghai, P. R. China and Business School of East China Normal University, Shanghai, P. R. China; College of Information Sciences and Technology, DongHua University, Shanghai, P. R. China; College of Information Sciences and Technology, DongHua University, Shanghai, P. R. China

  • Venue:
  • ISP'06 Proceedings of the 5th WSEAS International Conference on Information Security and Privacy
  • Year:
  • 2006


Abstract

A new neural network architecture, the Sequential Multi-Layer Covering Neural Network (SMNN), is preliminarily proved in this paper. In SMNN, hidden neurons separate the training samples until the original same-class training sets are empty. The paper proves that the target sets can be linearly separated in polynomial time. Several algorithms are proposed in light of these ideas, and we have applied them with good results, e.g., to voice classification. We present two examples: a two-dimensional spiral, which is difficult for a BP network, and a real data set.
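The sequential covering idea described in the abstract (hidden neurons repeatedly cover and remove same-class samples until none remain) can be sketched as follows. This is a minimal illustration, not the authors' actual SMNN algorithm: it assumes a simple spherical covering rule in which each "hidden neuron" is a sphere centered on a seed sample, with radius set to half the distance to the nearest sample of a different class. The function names `train_covering` and `predict` are hypothetical.

```python
import numpy as np

def train_covering(X, y):
    """Sequentially cover same-class samples with spherical 'hidden neurons'.

    Each neuron is a (center, radius, label) triple. Samples covered by a
    neuron are removed from the training set; the loop ends when no samples
    remain, mirroring the 'until the same-class sets are empty' criterion.
    """
    neurons = []
    remaining = np.arange(len(X))
    while remaining.size > 0:
        seed = remaining[0]
        center, label = X[seed], y[seed]
        # Radius: half the distance to the nearest other-class sample,
        # so the sphere never encloses a sample of a different class.
        other = remaining[y[remaining] != label]
        if other.size > 0:
            radius = np.linalg.norm(X[other] - center, axis=1).min() / 2.0
        else:
            radius = np.inf  # only one class left; cover everything
        neurons.append((center, radius, label))
        # Remove every same-class sample inside the sphere (at least the
        # seed itself, which guarantees termination).
        same = remaining[y[remaining] == label]
        dist = np.linalg.norm(X[same] - center, axis=1)
        remaining = np.setdiff1d(remaining, same[dist <= radius])
    return neurons

def predict(neurons, x):
    """Label a point by the first covering neuron, else the nearest center."""
    best_label, best_dist = None, np.inf
    for center, radius, label in neurons:
        d = np.linalg.norm(x - center)
        if d <= radius:
            return label
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label
```

Because each iteration removes at least the seed sample, the loop runs at most once per training sample, which is consistent with the polynomial-time claim made for the separation procedure.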