Neural networks with local receptive fields and superlinear VC dimension
Neural Computation
Generalized radial basis function (RBF) neurons are extensions of the RBF neuron model where the Euclidean norm is replaced by a weighted norm. We study binary-valued variants of generalized RBF neurons and compare their computational power in the Boolean domain with linear threshold neurons. As one of the main results, we show that generalized binary RBF neurons with any weighted norm can compute every Boolean function that is computed by a linear threshold neuron. While this inclusion turns into an equality if the RBF neuron uses the Euclidean norm, we exhibit a weighted norm where the inclusion is proper. Applications of the results yield bounds on the Vapnik-Chervonenkis (VC) dimension of RBF neural networks with binary inputs.
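The models compared in the abstract can be made concrete with a small sketch. A linear threshold neuron over binary inputs fires iff w·x ≥ t; a generalized binary RBF neuron fires iff the weighted squared distance to its center is at most a threshold, ‖x − c‖²_W = (x − c)ᵀW(x − c) ≤ θ, with W the identity giving the Euclidean case. The simulation below of a threshold neuron by a Euclidean RBF neuron rests on the algebraic identity x_i² = x_i on {0,1}ⁿ; it is a minimal illustration of the inclusion stated above, not necessarily the paper's own construction, and all function names are ours.

```python
from itertools import product

def threshold_neuron(w, t):
    # Linear threshold neuron: fire iff w.x >= t.
    return lambda x: int(sum(wi * xi for wi, xi in zip(w, x)) >= t)

def binary_rbf_neuron(c, theta, W=None):
    # Generalized binary RBF neuron: fire iff ||x - c||_W^2 <= theta,
    # where ||v||_W^2 = v^T W v; W=None means the Euclidean norm (W = I).
    def norm_sq(v):
        if W is None:
            return sum(vi * vi for vi in v)
        n = len(v)
        return sum(v[i] * W[i][j] * v[j] for i in range(n) for j in range(n))
    return lambda x: int(norm_sq([xi - ci for xi, ci in zip(x, c)]) <= theta)

def threshold_as_rbf(w, t):
    # On {0,1}^n, x_i^2 = x_i, so ||x - c||^2 = sum_i x_i(1 - 2c_i) + ||c||^2.
    # Choosing c_i = (1 + w_i)/2 makes the RBF condition ||x - c||^2 <= theta
    # equivalent to w.x >= ||c||^2 - theta; set theta = ||c||^2 - t.
    c = [(1 + wi) / 2 for wi in w]
    theta = sum(ci * ci for ci in c) - t
    return binary_rbf_neuron(c, theta)  # Euclidean norm suffices here

# Check the simulation exhaustively for one (arbitrarily chosen) threshold neuron.
w, t = [2, -1, 1], 1
f = threshold_neuron(w, t)
g = threshold_as_rbf(w, t)
assert all(f(x) == g(x) for x in product([0, 1], repeat=3))
```

The exhaustive check confirms the two neurons agree on all 2³ binary inputs; the same change of parameters works for any real weights w and threshold t, illustrating why every threshold-computable Boolean function is RBF-computable even with the plain Euclidean norm.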