Enhancing MLP networks using a distributed data representation
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Multi-Layer Perceptron (MLP) networks act as hyperplane classifiers when applied to classification problems, and can therefore be inefficient on problems whose class boundaries are poorly modeled by hyperplanes. Attempts to remedy this shortcoming typically introduce a new neural network model in which alternative node connection functions allow the formation of nonlinear class boundaries. In this paper we demonstrate a biologically motivated data representation scheme which, while working within the constraints of the MLP model, permits the development of nonlinear class boundaries. The enhancements afforded by this scheme are demonstrated and analyzed in the context of a classification problem.
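The idea in the abstract can be illustrated with a minimal sketch (not the paper's actual scheme): a 1-D two-class problem whose boundary cannot be a single hyperplane (threshold) in the raw input, made linearly separable by a distributed, coarse-coded representation built from overlapping Gaussian receptive fields. The number of receptive fields, their centres, and their width below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D two-class problem that no single threshold on x can solve:
# class 1 iff x lies in the middle interval (0.4, 0.6).
x = rng.uniform(0.0, 1.0, size=500)
y = ((x > 0.4) & (x < 0.6)).astype(float)

# Distributed (coarse-coded) representation: each input activates several
# overlapping Gaussian receptive fields. Centres and width are assumptions
# chosen for illustration, not values from the paper.
centres = np.linspace(0.0, 1.0, 10)
width = 0.1

def encode(x):
    # Shape (n_samples, n_fields): activation of each receptive field.
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

X = encode(x)

# A single logistic unit trained by gradient descent: a hyperplane
# classifier in the encoded space, hence a nonlinear boundary in x.
w = np.zeros(centres.size)
b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.5 * (X.T @ grad) / len(y)
    b -= 0.5 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

On the raw scalar input, the best single-threshold classifier can do little better than always predicting the majority class; in the coarse-coded space the same linear unit separates the classes almost perfectly, which is the sense in which a data representation scheme alone can yield nonlinear boundaries.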