Polynomial bounds for VC dimension of sigmoidal neural networks
STOC '95 Proceedings of the twenty-seventh annual ACM symposium on Theory of computing
The Vapnik-Chervonenkis (V-C) dimension is an important combinatorial tool in the analysis of learning problems in the PAC framework. For polynomial learnability, we seek upper bounds on the V-C dimension that are polynomial in the syntactic complexity of concepts. Such upper bounds are automatic for discrete concept classes, but hitherto little has been known about what general conditions guarantee polynomial bounds on V-C dimension for classes in which concepts and examples are represented by tuples of real numbers. In this paper, we show that for two general kinds of concept class the V-C dimension is polynomially bounded in the number of real numbers used to define a problem instance. One is classes where the criterion for membership of an instance in a concept can be expressed as a formula (in the first-order theory of the reals) with fixed quantification depth and exponentially bounded length, whose atomic predicates are polynomial inequalities of exponentially bounded degree. The other is classes where containment of an instance in a concept is testable in polynomial time, assuming we may compute standard arithmetic operations on reals exactly in constant time. Our results show that in the continuous case, as in the discrete, the real barrier to efficient learning in the Occam sense is complexity-theoretic and not information-theoretic. We present examples to show how these results apply to concept classes defined by geometrical figures and neural nets, and derive polynomial bounds on the V-C dimension for these classes.
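
As an illustration of the central notion, the following is a minimal Python sketch (ours, not code from the paper) of what it means for a concept class to shatter a point set. The concepts here are sign(p(x, y)) for degree-2 polynomials p, the simplest instance of the polynomial-inequality classes the paper treats: each concept is defined by 6 real coefficients, and the paper's results bound the V-C dimension polynomially in that parameter count. The names (features, shatters) and the random-sampling check are illustrative assumptions, not anything from the paper.

    # Hypothetical sketch: witnessing shattering for concepts of the form
    # sign(p(x, y)), p a degree-2 polynomial with 6 real coefficients.
    import numpy as np

    def features(point):
        # Degree-2 monomials of (x, y): the 6 real parameters of a concept.
        x, y = point
        return np.array([1.0, x, y, x * x, x * y, y * y])

    def shatters(points, n_samples=200_000, seed=0):
        # Sample random coefficient vectors and record which +/- labelings
        # of `points` are realized by sign(p). Returning True exhibits all
        # 2^len(points) labelings, a lower-bound witness for the V-C
        # dimension; returning False proves nothing (sampling is heuristic).
        rng = np.random.default_rng(seed)
        feats = np.stack([features(p) for p in points])
        realized = set()
        for _ in range(n_samples):
            coeffs = rng.standard_normal(feats.shape[1])
            realized.add(tuple(bool(v) for v in feats @ coeffs > 0))
            if len(realized) == 2 ** len(points):
                return True
        return False

    if __name__ == "__main__":
        # Five points whose degree-2 feature vectors are linearly
        # independent, so every +/- labeling is realizable by some concept.
        pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 2.0)]
        print("shattered:", shatters(pts))

Realizing all 2^n labelings certifies that the V-C dimension is at least n; failing to find them proves nothing, since the sampling is only heuristic. An exact check would instead decide, for each sign pattern, whether the corresponding system of strict linear inequalities over the coefficients is feasible (e.g., by linear programming), at the cost of a longer sketch.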