Neural network architecture selection: size depends on function complexity
ICANN'06: Proceedings of the 16th International Conference on Artificial Neural Networks - Volume Part I
The generalization ability of architectures of different sizes, with one and two hidden layers, trained with backpropagation combined with early stopping, has been analyzed. The dependence of generalization on the complexity of the function being implemented is studied using a recently introduced measure of Boolean function complexity. For a whole set of symmetric Boolean functions, it is found that large neural networks generalize better than smaller ones over a large range of function complexities, and that introducing a small second hidden layer of neurons further improves generalization for very complex functions. Quasi-randomly generated Boolean functions were also analyzed; in this case, generalization ability shows little variability across network sizes, for both one- and two-hidden-layer architectures.
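The experimental setup described above can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: it uses majority-of-n as a stand-in for the symmetric Boolean functions studied there, a single hidden layer, plain full-batch backpropagation, and early stopping on a held-out validation split. The complexity measure, benchmark set, and training hyperparameters of the original study are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example symmetric Boolean function: majority over n input bits
# (a stand-in for the symmetric functions analyzed in the paper).
n = 8
X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)],
             dtype=float)
y = (X.sum(axis=1) > n / 2).astype(float).reshape(-1, 1)

# Random split into training, validation (for early stopping), and test sets.
idx = rng.permutation(len(X))
tr, va, te = idx[:128], idx[128:192], idx[192:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(hidden, epochs=2000, lr=0.5, patience=200):
    """One-hidden-layer MLP, full-batch backprop + early stopping;
    returns test-set accuracy of the best-on-validation weights."""
    W1 = rng.normal(0, 0.5, (n, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    best_err, best_w, since_best = np.inf, None, 0
    for _ in range(epochs):
        # Forward pass on the training set.
        H = sigmoid(X[tr] @ W1 + b1)
        out = sigmoid(H @ W2 + b2)
        # Backward pass (cross-entropy gradient through the sigmoid output).
        d_out = (out - y[tr]) / len(tr)
        dW2 = H.T @ d_out; db2 = d_out.sum(0)
        dH = d_out @ W2.T * H * (1 - H)
        dW1 = X[tr].T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
        # Early stopping: keep the weights with lowest validation error.
        val = sigmoid(sigmoid(X[va] @ W1 + b1) @ W2 + b2)
        val_err = np.mean((val > 0.5) != y[va])
        if val_err < best_err:
            best_err = val_err
            best_w = (W1.copy(), b1.copy(), W2.copy(), b2.copy())
            since_best = 0
        else:
            since_best += 1
            if since_best > patience:
                break
    W1, b1, W2, b2 = best_w
    test = sigmoid(sigmoid(X[te] @ W1 + b1) @ W2 + b2)
    return 1.0 - np.mean((test > 0.5) != y[te])

# Compare generalization of a small and a larger hidden layer.
for h in (4, 16):
    print(h, round(train(h), 3))
```

Varying `hidden` (and, in the paper, adding a second hidden layer) while holding the training protocol fixed is what allows generalization ability to be compared across architecture sizes.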