A previous study proposed a constructive algorithm that implements convex recursive deletion regions with two-layer perceptrons. However, the absolute values of the weights produced by that algorithm grow as the number of nested layers of a convex recursive deletion region increases, and they also depend on the complexity of the region's structure: for a very complicated convex recursive deletion region, the weights can become very large. Moreover, the constructive procedure itself must still be run to obtain the network parameters (weights and thresholds). In this paper, we propose a simple three-layer network structure that implements convex recursive deletion regions in which all weights of the second and third layers are 1 and the thresholds of the second-layer nodes are pre-determined from the structure of the convex recursive deletion region. We also specify the activation function of the output node. In brief, all parameters (weights, thresholds, and activation functions) in the proposed structure are pre-determined, so no constructive algorithm is needed to solve convex recursive deletion region problems. We prove the feasibility of the proposed structure and give an illustrative example demonstrating how it implements convex recursive deletion regions.
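To make the idea concrete, the following is a minimal sketch (not the paper's actual construction or proof) of how such a fixed-parameter network could classify a nested region family. It assumes a hypothetical convex recursive deletion region built from three nested squares, each written as an intersection of half-planes; the first layer holds step units for the hyperplanes, the second layer uses all-1 weights with thresholds equal to each region's number of bounding hyperplanes, and the output node applies a parity-style activation to the nesting depth. All region definitions and the parity choice here are illustrative assumptions.

```python
import numpy as np

def step(x):
    # Hard-limiting (threshold) activation: 1 if x >= 0, else 0.
    return int(x >= 0)

def make_square(half):
    # The square |x| <= half, |y| <= half as four half-planes w.x - b >= 0.
    # (Illustrative hyperplanes; not taken from the paper.)
    return [(np.array([-1.0,  0.0]), -half),
            (np.array([ 1.0,  0.0]), -half),
            (np.array([ 0.0, -1.0]), -half),
            (np.array([ 0.0,  1.0]), -half)]

# Hypothetical nested regions X1 ⊇ X2 ⊇ X3.
regions = [make_square(3.0), make_square(2.0), make_square(1.0)]

def crdr_net(x):
    depth = 0
    for planes in regions:
        # Layer 1: one step unit per bounding hyperplane.
        h = [step(w @ x - b) for (w, b) in planes]
        # Layer 2: all incoming weights are 1; the threshold equals the
        # number of hyperplanes, so the unit fires iff the point lies
        # inside every half-plane, i.e. inside this convex region.
        inside = step(sum(h) - len(planes))
        # Layer 3 input: all weights 1, accumulating the nesting depth.
        depth += inside
    # Output activation: class is the parity of the nesting depth,
    # matching the alternating include/delete structure of the region.
    return depth % 2

print(crdr_net(np.array([0.0, 0.0])))  # inside X1, X2, X3: depth 3 -> class 1
print(crdr_net(np.array([1.5, 0.0])))  # inside X1, X2 only: depth 2 -> class 0
print(crdr_net(np.array([4.0, 0.0])))  # outside all regions: depth 0 -> class 0
```

Note that every weight and threshold above is fixed in advance by the region structure; nothing is learned or constructed iteratively, which is the property the abstract highlights.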