A statistical learning model is considered within the framework of the theory of uniform convergence of error frequencies, in the case where convergence is violated as the informativeness of the training examples increases. Drawbacks of nonconstructive refinements of Vapnik-Chervonenkis estimates, which rest on an assumption about the distribution law of the violations, are shown. A new approach to obtaining constructive estimates for mass data sets is proposed.
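For context, the Vapnik-Chervonenkis estimates the abstract refers to bound the worst-case gap between the observed error frequency and the true error probability, uniformly over a hypothesis class, as a function of sample size and VC dimension. The sketch below computes one common textbook form of such a bound; the exact constants and logarithmic terms vary between formulations, so this is an illustration rather than the specific estimate the paper refines.

```python
import math

def vc_bound(n, d, delta):
    """One common form of a VC-type uniform-convergence bound.

    With probability at least 1 - delta over an i.i.d. sample of size n,
    the gap between training error frequency and true error probability
    is at most this value, uniformly over a hypothesis class of
    VC dimension d. Constants differ across texts; this form is
    illustrative only.
    """
    return math.sqrt((d * (math.log(2 * n / d) + 1) + math.log(4 / delta)) / n)

# The bound shrinks as the sample grows, for a fixed class:
gap_small = vc_bound(n=10_000, d=10, delta=0.05)
gap_large = vc_bound(n=40_000, d=10, delta=0.05)
```

Such bounds are nonconstructive in the sense criticized above: they hold for any distribution, but give no computable handle on how much is lost when the i.i.d. assumption is violated by increasingly informative examples.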