Scale-Space Based Weak Regressors for Boosting
ECML '07 Proceedings of the 18th European conference on Machine Learning
Various forms of additive modeling techniques are widely used in pattern recognition and machine learning applications. The effectiveness of any additive modeling technique depends significantly on the choice of the weak learner and the form of the loss function. In this paper, we propose a novel scale-space kernel based approach to additive modeling. Our method draws on insights from the well-studied scale-space theory to choose suitable weak learners at different iterations of boosting algorithms, which are simple yet powerful additive modeling methods. At each iteration of the additive modeling, weak learners that best fit the current resolution of the data are chosen, and the resolution is then increased systematically. We demonstrate the results of the proposed framework on both synthetic and real datasets taken from the UCI machine learning repository. Although demonstrated specifically in the context of boosting algorithms, our approach is generic enough to be accommodated within general additive modeling techniques. Similarities and distinctions between the proposed algorithm and the widely used radial basis function networks and wavelet decomposition methods are also discussed.
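The coarse-to-fine scheme the abstract describes (fit the data at the current resolution, subtract the fit, refine the scale) can be sketched as a stagewise additive model whose weak learners are Gaussian kernel regressors with a shrinking bandwidth. This is a minimal illustrative sketch, not the authors' implementation: the function names, the geometric bandwidth schedule, the ridge regularizer, and the shrinkage factor are all assumptions introduced here for illustration.

```python
import numpy as np

def scale_space_boost(X, y, n_iter=10, sigma0=2.0, decay=0.7, shrinkage=0.5):
    """Stagewise additive modeling with Gaussian-kernel weak regressors.

    The kernel bandwidth (scale) shrinks geometrically each iteration,
    so early learners capture coarse structure and later learners fit
    progressively finer residual detail. All hyperparameters here are
    illustrative assumptions, not values from the paper.
    """
    residual = y.astype(float).copy()
    learners = []  # one (sigma, centers, weights) triple per iteration
    for t in range(n_iter):
        sigma = sigma0 * (decay ** t)  # coarse-to-fine scale schedule
        # Gaussian Gram matrix among the training points at this scale
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-d2 / (2.0 * sigma ** 2))
        # Ridge-regularized least-squares fit of the current residual
        w = np.linalg.solve(K + 1e-3 * np.eye(len(X)), residual)
        residual -= shrinkage * (K @ w)  # boosting-style update
        learners.append((sigma, X.copy(), shrinkage * w))
    return learners

def predict(learners, Xq):
    """Sum the contributions of all weak learners at the query points."""
    out = np.zeros(len(Xq))
    for sigma, centers, w in learners:
        d2 = ((Xq[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        out += np.exp(-d2 / (2.0 * sigma ** 2)) @ w
    return out
```

Unlike a radial basis function network, which fits all centers and widths jointly, this sketch fixes one scale per boosting iteration and fits only the residual left by the coarser scales, which is the scale-space ordering the abstract emphasizes.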