SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
Neural Computation
Minimum complexity regression estimation with weakly dependent observations
IEEE Transactions on Information Theory - Part 2
Generalization performance is a central concern of theoretical research in machine learning. Previous bounds on the generalization ability of the Tikhonov regularization algorithm are almost all derived under the assumption of independent and identically distributed (i.i.d.) samples. In this paper we move beyond this classical framework by establishing a bound on the generalization ability of the Tikhonov regularization algorithm with exponentially strongly mixing observations. We then show that the Tikhonov regularization algorithm with exponentially strongly mixing observations is consistent.
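The estimator studied in the abstract can be illustrated with a minimal sketch of Tikhonov (ridge) regularized least squares, here fitted on covariates drawn from a stationary AR(1) process, a standard example of an exponentially strongly mixing sequence. The data-generating process, the AR coefficient, and the regularization parameter `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def tikhonov_regression(X, y, lam):
    """Tikhonov (ridge) regularized least squares.

    Minimizes (1/n) * ||X w - y||^2 + lam * ||w||^2, whose closed-form
    solution is w = (X^T X + n * lam * I)^{-1} X^T y.
    """
    n, d = X.shape
    return np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
n, d = 500, 5

# Dependent (non-i.i.d.) covariates: each row follows a stationary AR(1)
# recursion X_t = 0.5 * X_{t-1} + eps_t, an exponentially mixing sequence.
X = np.zeros((n, d))
X[0] = rng.normal(size=d)
for t in range(1, n):
    X[t] = 0.5 * X[t - 1] + rng.normal(size=d)

# Noisy linear responses from a ground-truth weight vector (assumed model).
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w_hat = tikhonov_regression(X, y, lam=0.01)
```

Despite the serial dependence in the covariates, the regularized estimate `w_hat` stays close to `w_true` as the sample size grows, which is the kind of consistency statement the paper establishes under exponentially strongly mixing observations.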