We present tight surrogate regret bounds for the class of proper (i.e., Fisher consistent) losses. The bounds generalise the margin-based bounds due to Bartlett et al. (2006). The proof uses Taylor's theorem and leads to new representations of loss and regret, as well as a simple proof of the integral representation of proper losses. We also present a different formulation of a duality result for Bregman divergences, which leads to a simple demonstration of the convexity of composite losses formed with canonical link functions.
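For context, a sketch of the two standard objects the abstract refers to; the notation here is ours and need not match the paper's own formulation. The duality result concerns the Bregman divergence of a differentiable, strictly convex function f,

    D_f(x, y) = f(x) - f(y) - \langle x - y, \nabla f(y) \rangle,

which, writing f^* for the convex conjugate of f, satisfies the argument-swapping identity

    D_f(x, y) = D_{f^*}(\nabla f(y), \nabla f(x)).

The integral representation of proper losses mentioned above takes, in one common form,

    \ell(\hat{\eta}, y) = \int_0^1 \ell_c(\hat{\eta}, y)\, w(c)\, dc,

where \ell_c is the cost-weighted misclassification loss at threshold c, and the weight function w(c) = -L''(c) is determined by the conditional Bayes risk L of \ell.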