Bounds on the number of hidden neurons in multilayer perceptrons
IEEE Transactions on Neural Networks
A Constructive Approach to Calculating Lower Entropy Bounds
Neural Processing Letters
Bounds on the number of hidden neurons in three-layer binary neural networks
International Journal of Systems Science
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
Neural network architecture selection: can function complexity help?
Neural Processing Letters
Extension of the generalization complexity measure to real valued input data sets
ISNN'10 Proceedings of the 7th international conference on Advances in Neural Networks - Volume Part I
For three-layer artificial neural networks (TANs) with binary values, we consider the number of hidden units required for two problems: one is to find the number that is necessary and sufficient to realize an arbitrary mapping from I learning patterns (inputs) to the binary outputs of a TAN, and the other is to find a number sufficient for two-category classification (TCC) problems. We show that for the former, I - 1 hidden units are necessary and sufficient for I learning patterns, and that for the latter, about I/3 hidden units are sufficient. These results mean that the number of hidden units required can be reduced by taking the structure of the learning-pattern distribution into account.
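To make the flavor of such bounds concrete, the following is a minimal sketch (not the paper's construction) of the classic "one hidden unit per pattern" upper bound: I binary threshold units always suffice to realize any binary labelling of I distinct binary patterns, since each unit can be wired to fire on exactly one pattern. The paper's contribution is to tighten this kind of count to I - 1, and to about I/3 for TCC problems. All function and variable names below are illustrative assumptions.

```python
def step(x):
    """Heaviside threshold activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def build_network(patterns, targets):
    """Build a three-layer binary network with one hidden unit per pattern.

    Hidden unit i uses weight +1 where pattern i has a 1-bit and -1 where
    it has a 0-bit, with threshold equal to its number of 1-bits, so the
    unit's net input is maximized (and the threshold reached) only when
    the input equals pattern i exactly.
    """
    hidden = []
    for p in patterns:
        w = [1 if b == 1 else -1 for b in p]
        theta = sum(p)  # number of 1-bits in p
        hidden.append((w, theta))

    def forward(x):
        h = [step(sum(wi * xi for wi, xi in zip(w, x)) - theta)
             for (w, theta) in hidden]
        # Output unit: fires iff the (unique) active hidden unit
        # corresponds to a pattern labelled 1 -- an OR over those units.
        return step(sum(t * hi for t, hi in zip(targets, h)) - 1)

    return forward

# Usage: memorize XOR, a mapping no two-layer threshold network realizes.
patterns = [[0, 0], [0, 1], [1, 0], [1, 1]]
targets = [0, 1, 1, 0]
net = build_network(patterns, targets)
outputs = [net(p) for p in patterns]  # matches targets exactly
```

This construction deliberately ignores the distribution of the patterns; the I - 1 and I/3 results in the paper show how much that extra structure can save.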