Some examples for solving systems of algebraic equations by calculating Gröbner bases
Journal of Symbolic Computation
Gröbner bases: a computational approach to commutative algebra
On-line learning of smooth functions of a single variable
Theoretical Computer Science
Support vector machines, reproducing kernel Hilbert spaces, and randomized GACV
Advances in kernel methods
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Duality and Geometry in SVM Classifiers
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
A Generalized Representer Theorem
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
A new approximate maximal margin classification algorithm
The Journal of Machine Learning Research
Kernel Methods for Pattern Analysis
Distance-Based Classification with Lipschitz Functions
The Journal of Machine Learning Research
Maximal margin classification for metric spaces
Journal of Computer and System Sciences - Special issue: Learning theory 2003
Feature space perspectives for learning the kernel
Machine Learning
The Journal of Machine Learning Research
Refinement of Reproducing Kernels
The Journal of Machine Learning Research
When Is There a Representer Theorem? Vector Versus Matrix Regularizers
The Journal of Machine Learning Research
Reproducing Kernel Banach Spaces for Machine Learning
The Journal of Machine Learning Research
Just relax: convex programming methods for identifying sparse signals in noise
IEEE Transactions on Information Theory
Vector-valued reproducing kernel Banach spaces with applications to multi-task learning
Journal of Complexity
We view regularized learning of a function in a Banach space from its finite samples as an optimization problem. Within the framework of reproducing kernel Banach spaces, we prove the representer theorem for the minimizer of regularized learning schemes with a general loss function and a nondecreasing regularizer. When the loss function and the regularizer are differentiable, a characterization equation for the minimizer is also established.
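In the familiar Hilbert-space (RKHS) special case with a squared loss, the representer theorem guarantees that the minimizer of the regularized learning problem is a finite linear combination of kernel sections at the sample points. The following is a minimal sketch of that special case in NumPy; the Gaussian kernel, the regularization parameter `lam`, and the bandwidth `gamma` are illustrative choices, not taken from the text.

```python
import numpy as np

# Representer-theorem sketch in the RKHS (Hilbert) special case:
# the minimizer of  (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
# has the form  f(x) = sum_i alpha_i K(x_i, x),
# where  alpha = (K + lam * n * I)^{-1} y  and  K_ij = K(x_i, x_j).

def gaussian_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel matrix K(a_i, b_j) for 1-D inputs."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def fit_kernel_ridge(x, y, lam=1e-3, gamma=1.0):
    """Coefficients alpha of the representer-theorem expansion."""
    n = len(x)
    K = gaussian_kernel(x, x, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(alpha, x_train, x_new, gamma=1.0):
    """Evaluate f(x_new) = sum_i alpha_i K(x_i, x_new)."""
    return gaussian_kernel(x_new, x_train, gamma) @ alpha

# Usage on synthetic data: the fitted function is entirely determined
# by the n coefficients alpha, one per training sample.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(30)
alpha = fit_kernel_ridge(x, y)
y_hat = predict(alpha, x, x)
```

The Banach-space setting of the paper generalizes this picture: the expansion no longer involves an inner product, but the minimizer is still parameterized by finitely many quantities attached to the samples.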