Ten lectures on wavelets
Regularization theory and neural networks architectures
Neural Computation
The nature of statistical learning theory
Compactly supported tight affine spline frames in L2(Rd)
Mathematics of Computation
Choosing Multiple Parameters for Support Vector Machines
Machine Learning
A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension
IEEE Transactions on Knowledge and Data Engineering
Kernel Methods for Pattern Analysis
A tutorial on support vector regression
Statistics and Computing
The Entire Regularization Path for the Support Vector Machine
The Journal of Machine Learning Research
A Novel Kernel Method for Clustering
IEEE Transactions on Pattern Analysis and Machine Intelligence
Non-parametric regression with wavelet kernels
Applied Stochastic Models in Business and Industry - Statistical Learning
Wavelet kernel penalized estimation for non-equispaced design regression
Statistics and Computing
Frames, Reproducing Kernels, Regularization and Learning
The Journal of Machine Learning Research
A kernel path algorithm for support vector machines
Proceedings of the 24th international conference on Machine learning
On a new class of framelet kernels for support vector regression and regularization networks
PAKDD'07 Proceedings of the 11th Pacific-Asia conference on Advances in knowledge discovery and data mining
Symmetric interpolatory framelets and their erasure recovery properties
MRCS'06 Proceedings of the 2006 international conference on Multimedia Content Representation, Classification and Security
Wavelet support vector machine
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Kernel machine-based one-parameter regularized Fisher discriminant method for face recognition
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Image denoising using a tight frame
IEEE Transactions on Image Processing
Kernel Regression for Image Processing and Reconstruction
IEEE Transactions on Image Processing
Input space versus feature space in kernel-based methods
IEEE Transactions on Neural Networks
An introduction to kernel-based learning algorithms
IEEE Transactions on Neural Networks
Reduced Support Vector Machines: A Statistical Theory
IEEE Transactions on Neural Networks
Nonlinear Knowledge in Kernel Approximation
IEEE Transactions on Neural Networks
Temporal gene expression profiles reconstruction by support vector regression and framelet kernel
ISNN'10 Proceedings of the 7th international conference on Advances in Neural Networks - Volume Part II
Least squares regression with l1-regularizer in sum space
Journal of Computational and Applied Mathematics
Support vector regression and regularization networks are kernel-based techniques for recovering an unknown function from sample data. The choice of kernel function, which determines the mapping from the input space to the feature space, is crucial to such learning machines. Estimating an irregular function whose multiscale structure combines both steep and smooth variations is a hard problem: the traditional Gaussian kernel often gives unsatisfactory results because, operating at a single scale, it cannot avoid underfitting and overfitting simultaneously. In this paper, we present a new class of kernel functions derived from the framelet system. A framelet is a tight wavelet frame constructed via multiresolution analysis, combining the merits of wavelets and frames, and the construction and approximation properties of framelets have been well studied. Our goal is to combine the power of framelet representation with the strength of kernel methods in learning from sparse data. The proposed framelet kernels can approximate functions with a multiscale structure and can reduce the influence of noise in the data. Experiments on both simulated and real data illustrate the usefulness of the new kernels.