In the present paper, we study the problem of aggregation under the squared loss in the regression model with deterministic design. We obtain sharp oracle inequalities for convex aggregates defined via exponential weights, under general assumptions on the distribution of the errors and on the functions to aggregate. We show how these results can be applied to derive a sparsity oracle inequality.
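For reference, the exponentially weighted aggregate commonly considered in this setting can be sketched as follows; this is a standard formulation with assumed notation (prior \pi, temperature \beta, empirical norm \|\cdot\|_n), and the paper's exact definition and normalization may differ. Given a dictionary f_1, ..., f_M and observations Y, the aggregate is

\[
\hat f = \sum_{j=1}^{M} \theta_j f_j,
\qquad
\theta_j = \frac{\pi_j \exp\bigl(-\|Y - f_j\|_n^2 / \beta\bigr)}{\sum_{k=1}^{M} \pi_k \exp\bigl(-\|Y - f_k\|_n^2 / \beta\bigr)},
\]

where \pi = (\pi_1, ..., \pi_M) is a prior distribution on the dictionary, \beta > 0 is a temperature parameter, and \|\cdot\|_n denotes the empirical norm over the deterministic design points. Since the weights \theta_j are nonnegative and sum to one, \hat f is a convex combination of the functions to aggregate, which is what is meant above by a convex aggregate defined via exponential weights.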