We examine the information-theoretic learnability of folding networks, a very successful approach capable of dealing with tree-structured inputs. We derive bounds on the VC, pseudo-, and fat-shattering dimension of folding networks with various activation functions. As a consequence, valid generalization of folding networks can be guaranteed. However, distribution-independent bounds on the generalization error cannot exist in principle. We therefore propose two approaches that take the specific input distribution into account and allow us to derive explicit bounds on the deviation of the empirical error from the real error of a learning algorithm: the first approach requires the probability of large trees to be limited a priori, and the second deals with situations where the maximum input height in a concrete learning sample is restricted.
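To make the setting concrete, the following is a minimal sketch of how a folding network processes a tree-structured input: a single shared encoder is applied recursively, folding each node's label together with the already-encoded subtrees into a fixed-size state. The dimensions, the random weights, and the binary-tree node format `(label, left, right)` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sizes; a node is assumed to be (label, left_subtree, right_subtree).
DIM_LABEL, DIM_STATE = 3, 4

rng = np.random.default_rng(0)
W = rng.normal(size=(DIM_STATE, DIM_LABEL + 2 * DIM_STATE))  # shared encoder weights
b = rng.normal(size=DIM_STATE)
v = rng.normal(size=DIM_STATE)                               # output weights

def encode(tree):
    """Recursively fold a tree into a DIM_STATE-dimensional representation."""
    if tree is None:
        return np.zeros(DIM_STATE)  # empty subtree maps to the initial state
    label, left, right = tree
    x = np.concatenate([label, encode(left), encode(right)])
    return np.tanh(W @ x + b)       # sigmoidal activation, applied at every node

def predict(tree):
    """Map the folded representation of the whole tree to a scalar output."""
    return float(v @ encode(tree))

leaf = (np.ones(DIM_LABEL), None, None)
root = (np.zeros(DIM_LABEL), leaf, leaf)
print(predict(root))
```

Note that the recursion depth, and hence the number of times the shared weights are composed, grows with the height of the input tree; this is what makes the restriction on input height in the second approach a natural quantity to bound.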