Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
The learnability of formal concepts
COLT '90 Proceedings of the third annual workshop on Computational learning theory
Semantic complexity of classes of relational queries and query independent data partitioning
PODS '91 Proceedings of the tenth ACM SIGACT-SIGMOD-SIGART symposium on Principles of database systems
Learning time-varying concepts
NIPS-3 Proceedings of the 1990 conference on Advances in neural information processing systems 3
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Learning with a slowly changing distribution
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Information and Computation
The weighted majority algorithm
Information and Computation
Tracking Drifting Concepts By Minimizing Disagreements
Machine Learning - Special issue on computational learning theory
Predicting {0, 1}-functions on randomly drawn points
Information and Computation
Toward Efficient Agnostic Learning
Machine Learning - Special issue on computational learning theory, COLT'92
Sphere packing numbers for subsets of the Boolean n-cube with bounded Vapnik-Chervonenkis dimension
Journal of Combinatorial Theory Series A
Learning to model sequences generated by switching distributions
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
General bounds on the number of examples needed for learning probabilistic concepts
Journal of Computer and System Sciences
Learning changing concepts by exploiting the structure of change
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
On the complexity of learning from drifting distributions
Information and Computation
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
Learning Under Persistent Drift
EuroCOLT '97 Proceedings of the Third European Conference on Computational Learning Theory
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
Comparison of tracking algorithms for single layer threshold networks in the presence of random drift
IEEE Transactions on Signal Processing
Mining time-changing data streams
Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining
Efficient algorithms for learning functions with bounded variation
Information and Computation
Performance guarantees for hierarchical clustering
Journal of Computer and System Sciences - Special issue on COLT 2002
Collaborative filtering on streaming data with interest-drifting
Intelligent Data Analysis - Knowledge Discovery from Data Streams
A simple feature extraction for high dimensional image representations
SLSFS'05 Proceedings of the 2005 international conference on Subspace, Latent Structure and Feature Selection
New analysis and algorithm for learning with drifting distributions
ALT'12 Proceedings of the 23rd international conference on Algorithmic Learning Theory
We show that a $\frac{c\epsilon^3}{\mathrm{VCdim}(\mathcal{F})}$ bound on the rate of drift of the distribution generating the examples is sufficient for agnostic learning to relative accuracy $\epsilon$, where $c > 0$ is a constant; this matches a known necessary condition to within a constant factor. We establish a $\frac{c\epsilon^2}{\mathrm{VCdim}(\mathcal{F})}$ sufficient condition for the realizable case, also matching a known necessary condition to within a constant factor. We provide a relatively simple proof of a bound of $O\!\left(\frac{1}{\epsilon^2}\left(\mathrm{VCdim}(\mathcal{F}) + \log\frac{1}{\delta}\right)\right)$ on the sample complexity of agnostic learning in a fixed environment.
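As a rough illustration of the quantities in the abstract, the sketch below evaluates the stated drift-rate thresholds and the sample-complexity bound for given values of the VC dimension, $\epsilon$, and $\delta$. The constant `c` and the helper names are assumptions for illustration only; the abstract asserts these bounds only up to unspecified constant factors.

```python
import math

def max_drift_rate(vc_dim, epsilon, c=1.0, agnostic=True):
    """Drift-rate thresholds stated in the abstract:
    agnostic case:   c * eps^3 / VCdim(F)
    realizable case: c * eps^2 / VCdim(F)
    The constant c > 0 is not specified in the abstract; c=1 is a placeholder.
    """
    power = 3 if agnostic else 2
    return c * epsilon**power / vc_dim

def sample_complexity(vc_dim, epsilon, delta, c=1.0):
    """Sketch of the fixed-environment sample-complexity bound
    m = O((1/eps^2) * (VCdim(F) + log(1/delta))).
    Again, the hidden constant is unknown; c=1 is a placeholder.
    """
    return math.ceil((c / epsilon**2) * (vc_dim + math.log(1.0 / delta)))

# Example: VCdim = 10, eps = 0.1, delta = 0.05
print(max_drift_rate(10, 0.1))                  # agnostic threshold
print(max_drift_rate(10, 0.1, agnostic=False))  # realizable threshold
print(sample_complexity(10, 0.1, 0.05))         # sample-size sketch
```

Note how the agnostic threshold ($\epsilon^3$) is a factor of $\epsilon$ smaller than the realizable one ($\epsilon^2$): tolerating adversarial label noise requires the environment to drift more slowly.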