Estimating a non-uniformly sampled function from a set of learning points is a classical regression problem. Kernel methods have been widely used in this context, but every such problem raises two major tasks: optimizing the kernel and setting the fitness-regularization trade-off. This article presents a new method for estimating a function from noisy learning points in the framework of Reproducing Kernel Hilbert Spaces (RKHS). We introduce the Kernel Basis Pursuit algorithm, which builds an ℓ1-regularized multiple-kernel estimator. The general idea is to decompose the function to be learned over a sparse, near-optimal set of spanning functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. Computing the full regularization path with the LARS enables us to propose new adaptive criteria for finding an optimal fitness-regularization trade-off. Overall, we aim to provide a fast, parameter-free method for estimating non-uniformly sampled functions.
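The pipeline described above can be sketched with off-the-shelf tools: stack several kernel blocks into one dictionary, then run the LARS solver on the LASSO formulation to obtain the full regularization path and a sparse estimator. This is a minimal illustration, not the authors' implementation; the Gaussian kernels, the bandwidth choices, and the final stopping point on the path are assumptions made here for the example (the paper's adaptive criteria are not reproduced).

```python
import numpy as np
from sklearn.linear_model import lars_path

# Noisy, non-uniformly sampled learning points (illustrative data).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(60)

def kernel_dictionary(x_eval, centers, widths=(0.05, 0.1, 0.2)):
    """Multiple-kernel dictionary: Gaussian kernels centred on the
    training points, one block per bandwidth (bandwidths are
    hypothetical choices for this sketch)."""
    blocks = [
        np.exp(-((x_eval[:, None] - centers[None, :]) ** 2) / (2 * w**2))
        for w in widths
    ]
    return np.hstack(blocks)

K = kernel_dictionary(x, x)  # shape (60, 180): 60 points, 3 kernel blocks

# Full LASSO regularization path computed by the LARS algorithm.
alphas, active, coefs = lars_path(K, y, method="lasso")

# Pick one point on the path as the final sparse estimator
# (a fixed index here; the paper proposes adaptive criteria instead).
beta = coefs[:, min(20, coefs.shape[1] - 1)]
n_active = int(np.sum(beta != 0))
y_hat = K @ beta
print(f"active spanning functions: {n_active} of {K.shape[1]}")
```

Because LARS produces the whole path at roughly the cost of a single ordinary least-squares fit, any model-selection criterion can then scan the path cheaply, which is what makes the adaptive fitness-regularization criteria practical.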