COLT '90 Proceedings of the third annual workshop on Computational learning theory
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
An introduction to Kolmogorov complexity and its applications (2nd ed.)
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
A game of prediction with expert advice. Journal of Computer and System Sciences, special issue on the eighth annual workshop on computational learning theory, July 5–8, 1995.
Predictive complexity and information. Journal of Computer and System Sciences, special issue on COLT 2002.
One of the most effective ways of improving the performance of learning algorithms is "snooping", i.e., using some information about the data to be predicted when choosing the parameters of the learning algorithm, or the learning algorithm itself. Allowing different degrees of snooping makes it possible to attain a better performance Loss_P(x) of a prediction strategy P on a given data set x. We study the "snooping curves" L_x(k) = inf_{K(P) ≤ k} Loss_P(x), where K(P) is the Kolmogorov complexity of the prediction strategy P and the infimum is taken over all prediction strategies of complexity at most k. We prove that every non-increasing function can be approximated with arbitrary precision by some snooping curve L_x. Our framework is that of on-line prediction; for simplicity, we assume that the sequences x are binary.
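The idea behind a snooping curve can be illustrated with a toy computation. True Kolmogorov complexity K(P) is uncomputable, so the sketch below substitutes a crude stand-in: a small hand-picked family of on-line prediction strategies, each assigned a nominal description length in bits (the strategy names, the bit assignments, and the use of square loss are all assumptions of this sketch, not part of the paper). For each complexity budget k, the curve records the best cumulative loss attained by any strategy of nominal complexity at most k, and is non-increasing in k by construction.

```python
# Toy "snooping curve" on a binary sequence.
# NOTE: the nominal bit-complexities below are arbitrary stand-ins for K(P),
# which is uncomputable; square loss is chosen just for concreteness.

def square_loss(p, y):
    """Square loss of predicting probability p when the outcome is bit y."""
    return (p - y) ** 2

# Each strategy maps the history (tuple of past bits) to a prediction in [0, 1].
STRATEGIES = {
    # name: (nominal complexity in bits, prediction function)
    "always_half": (1, lambda hist: 0.5),
    "always_zero": (2, lambda hist: 0.0),
    "always_one":  (2, lambda hist: 1.0),
    "repeat_last": (5, lambda hist: float(hist[-1]) if hist else 0.5),
    "majority":    (8, lambda hist: sum(hist) / len(hist) if hist else 0.5),
}

def total_loss(predict, x):
    """Cumulative on-line square loss of a strategy on binary sequence x."""
    return sum(square_loss(predict(tuple(x[:t])), x[t]) for t in range(len(x)))

def snooping_curve(x):
    """Map each complexity budget k to the best loss over strategies with complexity <= k."""
    losses = [(k, total_loss(f, x)) for k, f in STRATEGIES.values()]
    budgets = sorted({k for k, _ in losses})
    return {b: min(loss for k, loss in losses if k <= b) for b in budgets}

x = [1, 1, 0, 1, 1, 1, 0, 1]
curve = snooping_curve(x)
```

Enlarging the strategy family (raising the snooping budget) can only lower the attainable loss, which is the monotonicity that the paper's approximation result exploits.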