Multilayer feedforward networks are universal approximators
Neural Networks
What size net gives valid generalization?
Neural Computation
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Information and Computation
Toward Efficient Agnostic Learning
Machine Learning - Special issue on computational learning theory, COLT'92
Vapnik-Chervonenkis dimension of recurrent neural networks
Discrete Applied Mathematics - Special issue: Vapnik-Chervonenkis dimension
On the effect of analog noise in discrete-time analog computations
Neural Computation
Learning with Recurrent Neural Networks
A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems
Learning in Neural Networks: Theoretical Foundations
Generalization Ability of Folding Networks
IEEE Transactions on Knowledge and Data Engineering
Sample complexity for learning recurrent perceptron mappings
IEEE Transactions on Information Theory
Structural risk minimization over data-dependent hierarchies
IEEE Transactions on Information Theory
A general framework for adaptive processing of data structures
IEEE Transactions on Neural Networks
Recurrent networks for structured data - A unifying approach and its properties
Cognitive Systems Research
The generalization ability of discrete-time partially recurrent networks is examined. It is well known that the VC dimension of recurrent networks is infinite in most interesting cases, so standard VC analysis cannot be applied directly. We establish guarantees for specific situations in which the transition function is a contraction or the probability of long inputs is restricted. For the general case, we derive posterior bounds that take the input data into account. They are obtained via a generalization of the luckiness framework to the agnostic setting. The resulting formalism makes it possible to focus on representative parts of the data, and it also covers more general situations such as long-term prediction.
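The contraction condition can be made concrete with a small numerical sketch. The snippet below is illustrative only: the vanilla update x_{t+1} = tanh(W x_t + V a_t), the rescaling of W to spectral norm 0.9, and the helper step are assumptions for the demo, not the paper's construction. Since tanh is 1-Lipschitz, a recurrent weight matrix with spectral norm below 1 makes the state update a contraction, so two trajectories started from different initial states but driven by the same input sequence converge geometrically.

import numpy as np

rng = np.random.default_rng(0)
n_state, n_in, T = 8, 4, 50

# Illustrative recurrent weights, rescaled so the spectral norm of W
# is 0.9 < 1; tanh is 1-Lipschitz, so x -> tanh(W x + V a) is then a
# contraction in the state argument x.
W = rng.normal(size=(n_state, n_state))
W *= 0.9 / np.linalg.norm(W, 2)
V = rng.normal(size=(n_state, n_in))

def step(x, a):
    return np.tanh(W @ x + V @ a)

# Two different initial states driven by one shared input sequence:
# the contraction forces the trajectories together geometrically.
x = rng.normal(size=n_state)
y = rng.normal(size=n_state)
for t in range(T):
    a = rng.normal(size=n_in)
    x, y = step(x, a), step(y, a)
    if t % 10 == 9:
        print(f"t={t + 1:2d}  ||x - y|| = {np.linalg.norm(x - y):.2e}")

Under these assumptions the printed distances shrink by at least a factor of 0.9 per step, so the network effectively forgets all but a bounded suffix of its input; this finite effective memory is the intuition behind the capacity guarantees in the contractive case, while restricting the probability of long inputs plays an analogous role when no contraction is available.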