Multilayer neural networks (MLNNs) suffer from several drawbacks, including the difficulty of determining the number of hidden nodes and their black-box nature. We propose a new dynamic construction mechanism for MLNNs that overcomes these inherent drawbacks. The main idea of our work is to train hidden neurons one at a time and add each to the network dynamically, reducing the learning error step by step. In this paper, a hidden neuron acts as a linear classifier that answers yes (Y) or no (N) to whether the input data belongs to a specific class. We call such a linear classifier a Y/N classifier and the hidden neuron a Y/N neuron. The number of Y/N neurons is determined self-adaptively according to the given learning error, which avoids the overlearning (overfitting) problem. The dynamically constructed MLNN with Y/N neurons is called a Y/N neural network. We prove that a Y/N neural network always converges to the required solution and illustrate that Y/N neural networks can be applied to very complex classification problems.
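The constructive loop described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual algorithm: here a Y/N neuron is realized as a simple perceptron, each new neuron is trained on the positive samples that no earlier neuron yet covers (against all negative samples), and the network's answer is the OR of the neurons' answers. All function names and the OR combination rule are assumptions for illustration.

```python
import numpy as np

def train_yn_neuron(X, y, epochs=200, lr=0.1):
    """Train one Y/N neuron: a perceptron-style linear classifier that
    answers yes (1) or no (0) to class membership.  (Illustrative stand-in
    for the paper's Y/N neuron.)"""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1.0 if xi @ w + b > 0 else 0.0
            w += lr * (yi - pred) * xi   # mistake-driven perceptron update
            b += lr * (yi - pred)
    return w, b

def fires(w, b, X):
    """Y/N answer of one neuron on a batch of inputs."""
    return (X @ w + b > 0).astype(float)

def build_yn_network(X, y, target_error=0.0, max_neurons=5):
    """Dynamic construction: add Y/N neurons one at a time until the
    learning error is at or below target_error, so the number of hidden
    neurons is determined self-adaptively rather than fixed in advance."""
    neurons = []
    for _ in range(max_neurons):
        # network answer = OR over the Y/N neurons added so far
        covered = np.zeros(len(y), dtype=bool)
        for w, b in neurons:
            covered |= fires(w, b, X).astype(bool)
        error = np.mean(covered.astype(float) != y)
        if error <= target_error:
            break
        # train the next neuron on still-uncovered positives vs. all negatives
        mask = (~covered) | (y == 0)
        neurons.append(train_yn_neuron(X[mask], y[mask]))
    return neurons

# Usage on a toy linearly separable problem ("yes" only for input (1,1)):
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])
net = build_yn_network(X, y)
```

On this toy problem a single Y/N neuron suffices; on harder class boundaries the loop keeps appending neurons until the stopping criterion is met, which is the sense in which the network's size adapts to the given learning error.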