Pattern Recognition for Conditionally Independent Data
The Journal of Machine Learning Research
In this work we consider the task of relaxing the i.i.d. assumption in online pattern recognition (or classification), aiming to make existing learning algorithms applicable to a wider range of tasks. Online pattern recognition consists in predicting a sequence of labels, each on the basis of the object given for that label and of the examples (pairs of objects and labels) learned so far. Traditionally, this task is considered under the assumption that examples are independent and identically distributed. However, it turns out that many results of pattern recognition theory carry over under a much weaker assumption: namely, that only the objects are conditionally independent and identically distributed given the labels, while the only condition on the distribution of labels is that the rate of occurrence of each label stays above some positive threshold. We find a broad class of learning algorithms for which estimates of the probability of a classification error obtained under the classical i.i.d. assumption can be generalised to similar estimates for the case of conditionally i.i.d. distributed examples.
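The setting described above can be illustrated with a small simulation. In the sketch below (illustrative assumptions throughout: the Gaussian class-conditional distributions, the periodic label process, and the 1-nearest-neighbour learner are all chosen for demonstration, not taken from the paper), the labels are not i.i.d. but each label's rate of occurrence is bounded below by a positive constant, while the objects are drawn i.i.d. conditionally on their labels; an online learner then predicts each label from the current object and all previously learned examples.

```python
import random

random.seed(0)

# Illustrative assumption: objects for label y are drawn i.i.d.
# from N(MU[y], 1), i.e. conditionally i.i.d. given the labels.
MU = {0: -2.0, 1: 2.0}

def draw_object(label):
    return random.gauss(MU[label], 1.0)

# The labels need NOT be i.i.d.: here they follow a deterministic
# periodic process, yet each label occurs at rate at least 1/3,
# satisfying the positive-threshold condition on label frequencies.
def label_process(t):
    return [0, 0, 1][t % 3]

def run_online_1nn(n_steps):
    """Online pattern recognition protocol: at each step, predict the
    label of the current object from the examples learned so far,
    then observe the true label and add the example to the sample."""
    seen = []       # (object, label) examples learned so far
    errors = 0
    for t in range(n_steps):
        y = label_process(t)
        x = draw_object(y)
        if seen:
            # 1-nearest-neighbour prediction over past examples
            _, y_hat = min(seen, key=lambda ex: abs(ex[0] - x))
        else:
            y_hat = 0   # arbitrary guess before any example is seen
        errors += (y_hat != y)
        seen.append((x, y))
    return errors / n_steps

print(run_online_1nn(2000))
```

With well-separated class-conditional distributions the empirical error rate stays small even though the label sequence is far from i.i.d., which is the kind of behaviour the conditional-i.i.d. error estimates in the paper are meant to capture.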