In this article, we present a technique to improve the generalization ability of multilayer neural networks. The proposed method introduces linear constraints on the weight representation, derived from invariance properties of the training targets. We propose a learning method that incorporates these linear constraints into the error function as a penalty term. Furthermore, introducing such constraints reduces the VC dimension of the network, and we derive bounds on the VC dimension of neural networks under these constraints. Finally, we demonstrate the effectiveness of the proposed method through experiments.
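As a minimal sketch of the penalty-term idea described above (not the authors' actual formulation): a linear constraint C w = 0 on the weights of a single linear model can be encouraged by adding a term λ‖C w‖² to the squared error and following its gradient. The constraint matrix C, the penalty strength `lam`, and the toy symmetric-target data are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data whose target is invariant to swapping the two inputs,
# so the invariant solution satisfies w[0] - w[1] = 0.
X = rng.normal(size=(200, 2))
y = X[:, 0] + X[:, 1]  # symmetric target

C = np.array([[1.0, -1.0]])  # hypothetical linear constraint: w0 - w1 = 0
lam = 10.0                   # assumed penalty strength
w = rng.normal(size=2)

for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) / len(X)  # gradient of the mean squared error
    grad += 2 * lam * C.T @ (C @ w)        # gradient of the penalty lam * ||C w||^2
    w -= 0.01 * grad

print(w)  # weights driven toward satisfying w[0] == w[1]
```

Here the penalty shrinks the component of w that violates the constraint; in the constrained subspace the model effectively has fewer free parameters, which is the intuition behind the VC-dimension reduction the abstract refers to.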