Exploiting generative models in discriminative classifiers
Proceedings of the 1998 conference on Advances in neural information processing systems II
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Kernel independent component analysis
The Journal of Machine Learning Research
Distinctive Image Features from Scale-Invariant Keypoints
International Journal of Computer Vision
Learning the Kernel Matrix with Semidefinite Programming
The Journal of Machine Learning Research
Multiple kernel learning, conic duality, and the SMO algorithm
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Histograms of Oriented Gradients for Human Detection
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 1
A Sparse Texture Representation Using Local Affine Regions
IEEE Transactions on Pattern Analysis and Machine Intelligence
Beyond Bags of Features: Spatial Pyramid Matching for Recognizing Natural Scene Categories
CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 2
Large Scale Multiple Kernel Learning
The Journal of Machine Learning Research
More efficiency in multiple kernel learning
Proceedings of the 24th international conference on Machine learning
Multiclass multiple kernel learning
Proceedings of the 24th international conference on Machine learning
Representing shape with a spatial pyramid kernel
Proceedings of the 6th ACM international conference on Image and video retrieval
Image categorization via robust pLSA
Pattern Recognition Letters
Boosted kernel for image categorization
Multimedia Tools and Applications
To achieve good performance in object classification, it is necessary to combine information from various image features. Because large margin classifiers are built on similarity measures between samples, called kernels, finding appropriate feature combinations boils down to designing a good kernel from a set of candidates, for example, positive mixtures of predetermined base kernels. There are several ways to determine the mixing weights of multiple kernels: (a) uniform weights, (b) a brute-force search over a validation set, and (c) multiple kernel learning (MKL). MKL is theoretically and technically attractive because it learns the kernel weights and the classifier simultaneously under the margin criterion. In practice, however, we often observe that a support vector machine (SVM) with the uniformly averaged kernel performs at least as well as MKL. In this paper, we propose an alternative two-step approach: first, the kernel weights are determined by optimizing the kernel-target alignment score; then the combined kernel is used by a standard single-kernel SVM. Experimental results on the VOC 2008 data set [8] show that this simple procedure outperforms both the average kernel and MKL.
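The two-step procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact optimization: it weights each base kernel by its individual kernel-target alignment score, A(K, yyᵀ) = ⟨K, yyᵀ⟩_F / (‖K‖_F ‖yyᵀ‖_F), rather than optimizing the alignment of the mixture jointly, and the RBF base kernels, bandwidths, and toy data are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVC

def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F)."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def combine_kernels(kernels, y):
    """Step 1: weight each base kernel by its (clipped) alignment score,
    normalize the weights, and return the mixed kernel."""
    w = np.maximum([alignment(K, y) for K in kernels], 0.0)
    w = w / w.sum()
    K = sum(wi * Ki for wi, Ki in zip(w, kernels))
    return K, w

def rbf(X, gamma):
    """Gaussian (RBF) kernel matrix on the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy data standing in for image features (hypothetical, for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0])  # labels in {-1, +1}

# Two base kernels with different bandwidths play the role of
# kernels built from different image features.
kernels = [rbf(X, 0.5), rbf(X, 5.0)]
K, w = combine_kernels(kernels, y)

# Step 2: train a standard single-kernel SVM on the combined kernel.
clf = SVC(kernel="precomputed").fit(K, y)
print("weights:", w, "training accuracy:", clf.score(K, y))
```

Because the weights are computed in closed form before training, this avoids the joint weight-and-classifier optimization that makes MKL costly, at the price of using a heuristic alignment criterion.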