The problem of learning from data involving both function values and gradients is considered in the framework of least-squares regularized regression in reproducing kernel Hilbert spaces. The algorithm is implemented by solving a linear system whose coefficient matrix involves block matrices of the kind used to generate graph Laplacians and Hessians. The additional gradient data improve the learning performance of the algorithm. The error analysis is carried out by means of sampling operators for the sample error and integral operators in Sobolev spaces for the approximation error.
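To make the construction concrete, the following is a minimal sketch, not the paper's exact scheme, of one way such an algorithm can be realized. It assumes a Gaussian kernel, whose derivative reproducing property yields closed-form kernel derivatives, and a simplified objective penalizing squared errors on values and gradients plus the RKHS norm; the coefficients then solve a single linear system whose Gram matrix stacks a value block, first-derivative blocks, and a Hessian-type block. The function names (`gram_blocks`, `fit`, `predict`) and the parameters `sigma` and `lam` are illustrative assumptions, not the paper's notation.

```python
# Illustrative sketch only: Gaussian-kernel regularized least squares that
# fits noisy function values AND gradients.  All parameter values below are
# assumptions for the demo, not choices from the paper.
import numpy as np

def gram_blocks(X, sigma):
    """Block Gram matrix for K(u, v) = exp(-|u - v|^2 / (2 sigma^2)):
    value block, first-derivative blocks, and mixed-second-derivative
    (Hessian-type) block, stacked into one (m(1+d)) x (m(1+d)) matrix."""
    m, d = X.shape
    diff = X[:, None, :] - X[None, :, :]              # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff**2, axis=2) / (2 * sigma**2))
    # B[i, (j, b)] = (d/du_b) K(u, x_i) evaluated at u = x_j
    B = (K[:, :, None] * diff / sigma**2).reshape(m, m * d)
    # C[(i, a), (j, b)] = mixed second derivative of K at (x_j, x_i)
    C = K[:, :, None, None] * (np.eye(d) / sigma**2
        - diff[:, :, :, None] * diff[:, :, None, :] / sigma**4)
    C = C.transpose(0, 2, 1, 3).reshape(m * d, m * d)
    return np.block([[K, B], [B.T, C]])

def fit(X, y, G, sigma, lam):
    """Coefficients theta solving (M + m*lam*I) theta = [y; vec(G)]."""
    m = X.shape[0]
    M = gram_blocks(X, sigma)
    Y = np.concatenate([y, G.ravel()])
    return np.linalg.solve(M + m * lam * np.eye(M.shape[0]), Y)

def predict(X, theta, Xnew, sigma):
    """f(x) = sum_j c_j K(x_j, x) + sum_{j,b} D[j,b] (d/du_b) K(u, x)|_{u=x_j}."""
    m, d = X.shape
    c, D = theta[:m], theta[m:].reshape(m, d)
    diff = Xnew[:, None, :] - X[None, :, :]           # diff[n, j] = x_n - x_j
    K = np.exp(-np.sum(diff**2, axis=2) / (2 * sigma**2))
    return K @ c + np.einsum('nmd,nm,md->n', diff / sigma**2, K, D)

# Toy usage: recover f(x) = sin(x_1) on R^2 from noisy values and gradients.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 2))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
G = np.column_stack([np.cos(X[:, 0]), np.zeros(40)]) \
    + 0.05 * rng.standard_normal((40, 2))
theta = fit(X, y, G, sigma=1.0, lam=1e-4)
Xtest = rng.uniform(-2, 2, size=(5, 2))
print(np.round(predict(X, theta, Xtest, sigma=1.0), 3))
print(np.round(np.sin(Xtest[:, 0]), 3))
```

Under the expansion f = Σ_j c_j K(x_j, ·) + Σ_{j,b} d_{j,b} ∂K(u, ·)/∂u_b |_{u=x_j}, the fitted values and gradients at the sample points equal Mθ, so the regularized objective reduces to the linear system (M + mλI)θ = Y. This mirrors, in simplified form, the abstract's description of a coefficient matrix built from derivative block matrices.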