A neural network structure with constant weights to implement convex recursive deletion regions

  • Authors:
  • Che-Chern Lin

  • Affiliations:
  • National Kaohsiung Normal University, Department of Industrial Technology Education, Kaohsiung, Taiwan, R.O.C.

  • Venue:
  • NN'07: Proceedings of the 8th WSEAS International Conference on Neural Networks - Volume 8
  • Year:
  • 2007

Abstract

A previous study proposed a constructive algorithm for implementing convex recursive deletion regions with two-layer perceptrons. However, the absolute values of the weights produced by that algorithm grow as the number of nested layers in a convex recursive deletion region increases, and they also depend on the complexity of the region's structure: for a very complicated convex recursive deletion region, the resulting weights can become very large. Moreover, the constructive procedure must still be run to obtain the network parameters (weights and thresholds). In this paper, we propose a simple three-layer network structure for implementing convex recursive deletion regions in which all weights of the second and third layers are 1 and the thresholds of the second-layer nodes are pre-determined by the structure of the convex recursive deletion region. We also specify the activation function for the output node. In short, every parameter of the proposed structure (weights, thresholds, and activation functions) is pre-determined, so no constructive algorithm is needed to solve convex recursive deletion region problems. We prove the feasibility of the proposed structure and give an illustrative example demonstrating how it implements convex recursive deletion regions.
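
To make the idea concrete, the sketch below shows one plausible reading of such a fixed-weight construction; it is not the paper's exact network. It assumes a convex recursive deletion region built from nested convex regions K_1 ⊇ K_2 ⊇ ... ⊇ K_m, each given as an intersection of half-spaces: first-layer hard-limit units test the half-spaces, second-layer units with unit weights and thresholds equal to the number of half-spaces detect membership in each K_i, a third-layer unit with unit weights sums the memberships, and a parity-style output activation decides the class. The function names, the threshold choice, and the parity output rule are illustrative assumptions.

import numpy as np

def step(x):
    # Hard-limit activation: 1 if x >= 0, else 0.
    return (x >= 0).astype(float)

def crdr_classify(x, regions):
    # x       : input vector, shape (d,)
    # regions : list of (W, b) pairs; K_i = { x : W @ x + b >= 0 componentwise },
    #           ordered from outermost to innermost (nested).
    # Returns 1 if x lies in K_1\K_2 ∪ K_3\K_4 ∪ ..., else 0.
    count = 0
    for W, b in regions:
        # First layer: one hard-limit unit per half-space of K_i.
        h = step(W @ x + b)
        # Second layer: unit weights, threshold = number of half-spaces, so the
        # unit fires only when x satisfies every constraint of K_i (assumption).
        inside = step(np.sum(h) - len(b))
        count += inside  # third layer: unit weights, plain summation
    # Output activation: odd count -> inside the deletion region (assumed parity rule).
    return int(count) % 2

if __name__ == "__main__":
    # Two nested squares in the plane: K_1 = [-2,2]^2, K_2 = [-1,1]^2.
    # The target region is the "frame" K_1 \ K_2.
    K1 = (np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], float),
          np.array([2, 2, 2, 2], float))
    K2 = (np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], float),
          np.array([1, 1, 1, 1], float))
    regions = [K1, K2]

    print(crdr_classify(np.array([1.5, 0.0]), regions))  # 1: in the frame
    print(crdr_classify(np.array([0.0, 0.0]), regions))  # 0: deleted inner square
    print(crdr_classify(np.array([3.0, 0.0]), regions))  # 0: outside everything

Note that all upper-layer weights in this sketch are fixed to 1 and the thresholds follow directly from the region description, mirroring the abstract's claim that no constructive training procedure is required.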