We describe an approach to statistically verifying complex controllers. The approach is based on deriving practical Vapnik-Chervonenkis-style (VC) generalization bounds for binary classifiers with weighted loss; an important special case is bounding the probability of a false positive. We show how existing methods for bounding classification error can be extended to yield similar bounds on the false-positive probability, as well as bounds in a decision-theoretic setting that allows tradeoffs between false negatives and false positives. Finally, we describe experiments that use these bounds to statistically verify computational properties of an iterative controller for an Organic Air Vehicle (OAV).
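For reference, a minimal sketch of the kind of bound involved. The classical VC generalization bound states that, with probability at least 1 - \delta over a sample of n i.i.d. examples, every hypothesis h in a class of VC dimension d satisfies (constants vary slightly across sources)

\[
R(h) \;\le\; \widehat{R}_n(h) + \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}},
\]

where R(h) is the true error probability and \widehat{R}_n(h) the empirical error on the sample. As an illustration only, and not necessarily the derivation used here, one natural route to a false-positive bound is to apply the same argument to the conditional distribution of negative examples: with n^- negative samples, the false-positive probability FP(h) = P(h(x) = 1 \mid y = 0) is bounded in terms of the empirical false-positive rate,

\[
FP(h) \;\le\; \widehat{FP}_{n^-}(h) + \sqrt{\frac{d\left(\ln\frac{2n^-}{d} + 1\right) + \ln\frac{4}{\delta}}{n^-}}.
\]

In the decision-theoretic setting, a weighted loss of the form L(h) = c_{FP}\,FP(h) + c_{FN}\,FN(h), with illustrative cost weights c_{FP} and c_{FN}, then makes the tradeoff between the two error types explicit.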