We start from a simple asymptotic result for the problem of on-line regression with the quadratic loss function: the class of continuous limited-memory prediction strategies admits a “leading prediction strategy”. The leading strategy not only performs asymptotically at least as well as any continuous limited-memory strategy, but also has the property that the excess loss of any continuous limited-memory strategy is determined by how closely that strategy imitates the leading one. More specifically, for any class of prediction strategies constituting a reproducing kernel Hilbert space we construct a leading strategy, in the sense that the loss of any prediction strategy whose norm is not too large is determined by how closely it imitates the leading strategy. This result is extended to the loss functions given by Bregman divergences and by strictly proper scoring rules.
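The key property for the quadratic loss can be illustrated with a toy simulation. This is not the paper's construction: it only uses the elementary fact that if the leading prediction equals the conditional mean of the outcome, then the expected excess quadratic loss of any other strategy at each step is the squared distance between its prediction and the leading one, so cumulative excess loss tracks cumulative imitation error. The specific distributions below are arbitrary choices for the sketch.

```python
import random

random.seed(0)

# Hedged sketch: l_n plays the role of a "leading" prediction equal to the
# conditional mean of the outcome y_n; p_n is an arbitrary other strategy.
# Then E[(p_n - y_n)^2 - (l_n - y_n)^2] = (p_n - l_n)^2, so the average
# excess loss should nearly coincide with the average imitation error.

N = 200_000
excess = 0.0      # cumulative loss(p) - loss(l)
imitation = 0.0   # cumulative (p_n - l_n)^2

for n in range(N):
    l = 0.5 + 0.3 * random.random()          # leading prediction (cond. mean)
    y = l + random.gauss(0.0, 0.1)           # outcome with conditional mean l
    p = l + 0.2 * (random.random() - 0.5)    # some other prediction strategy
    excess += (p - y) ** 2 - (l - y) ** 2
    imitation += (p - l) ** 2

print(excess / N, imitation / N)  # the two averages nearly coincide
```

The cross term 2(p_n - l_n)(l_n - y_n) has zero conditional mean, so it averages out over many rounds; this is exactly why the excess loss is "determined by how closely the strategy imitates the leading strategy" in this simple setting.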