Neural network architecture selection: can function complexity help?
Neural Processing Letters
In this paper, we analyze Boolean functions using a recently proposed measure of their complexity. This complexity measure, motivated by the aim of relating the complexity of a function to the generalization ability obtained when the function is implemented in a feed-forward neural network, is the sum of a number of components. We concentrate on the case in which the first two of these components are used. The first is related to the "average sensitivity" of the function, and the second is, in a sense, a measure of the "randomness" or lack of structure of the function. We investigate the importance of including the second term in the complexity measure, and we consider to what extent these two terms suffice as an indicator of how difficult it is to learn a Boolean function. We also explore the existence of very complex Boolean functions, considering, in particular, the symmetric Boolean functions.
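To make the first component concrete, the sketch below computes the average sensitivity of a Boolean function under its standard definition: the average, over all inputs, of the number of single-bit flips that change the function's output. This is an illustrative implementation of that standard notion, not the paper's exact formula; the function names are our own.

```python
from itertools import product

def average_sensitivity(f, n):
    """Average over all 2^n inputs x of the number of coordinates i
    such that flipping bit i of x changes f(x)."""
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] ^= 1  # flip the i-th bit
            if f(tuple(y)) != fx:
                total += 1
    return total / 2 ** n

# Parity is a symmetric Boolean function: every single-bit flip
# changes its output, so its average sensitivity is maximal (= n).
parity = lambda x: sum(x) % 2
print(average_sensitivity(parity, 4))  # 4.0
```

Parity illustrates why the symmetric functions are a natural place to look for high complexity: it attains the maximum possible value of the first component.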