This paper discusses the applications of certain combinatorial and probabilistic techniques to the analysis of machine learning. Probabilistic models of learning initially addressed binary classification (or pattern classification). Subsequently, analysis was extended to regression problems, and to classification problems in which the classification is achieved by using real-valued functions (where the concept of a large margin has proven useful). Another development, important in obtaining more applicable models, has been the derivation of data-dependent bounds. Here, we discuss some of the key probabilistic and combinatorial techniques and results, focusing on those of most relevance to researchers in discrete applied mathematics.
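The central combinatorial quantity in the binary-classification analyses mentioned above is the Vapnik-Chervonenkis dimension: the size of the largest sample that a hypothesis class can label in all possible ways ("shatter"). As an illustration (not part of the paper), the following sketch brute-forces this quantity for a finite hypothesis class over a finite sample; the function and class names are chosen here for exposition, and the result is only a lower bound on the true VC dimension when the sample does not exhaust the domain.

```python
from itertools import combinations

def vc_dimension(points, hypotheses):
    """Brute-force VC dimension of a finite hypothesis class.

    points: list of sample points.
    hypotheses: list of predicates mapping a point to 0/1.
    Returns the size of the largest subset of `points` that is
    shattered, i.e. on which the class realizes all 2^d labelings.
    Stopping at the first unshattered size is sound, since any
    subset of a shattered set is itself shattered.
    """
    best = 0
    for d in range(1, len(points) + 1):
        shattered_at_d = False
        for subset in combinations(points, d):
            labelings = {tuple(h(x) for x in subset) for h in hypotheses}
            if len(labelings) == 2 ** d:  # every dichotomy is realized
                shattered_at_d = True
                break
        if not shattered_at_d:
            break
        best = d
    return best

# One-dimensional thresholds x -> 1[x >= t]: VC dimension 1,
# since the labeling (1, 0) on a pair x1 < x2 is never realized.
points = [1, 2, 3, 4, 5]
thresholds = [lambda x, t=t: int(x >= t) for t in range(7)]
print(vc_dimension(points, thresholds))  # 1

# Intervals x -> 1[a <= x <= b]: VC dimension 2,
# since the labeling (1, 0, 1) on a triple is never realized.
intervals = [lambda x, a=a, b=b: int(a <= x <= b)
             for a in range(7) for b in range(a, 7)]
print(vc_dimension(points, intervals))  # 2
```

The two examples reproduce the textbook values for thresholds and intervals on the line; quantitative bounds on the growth of the number of realizable labelings (Sauer's lemma, and the sphere-packing refinements cited in the surveyed literature) are what turn this combinatorial quantity into sample-size bounds.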