Exploiting generative models in discriminative classifiers. Proceedings of the 1998 conference on Advances in Neural Information Processing Systems II.
Kernel PCA and de-noising in feature spaces. Proceedings of the 1998 conference on Advances in Neural Information Processing Systems II.
Diffusion Kernels on Graphs and Other Discrete Input Spaces. ICML '02: Proceedings of the Nineteenth International Conference on Machine Learning.
The Journal of Machine Learning Research.
ICCV '03: Proceedings of the Ninth IEEE International Conference on Computer Vision, Volume 2.
Kernel Methods for Pattern Analysis.
The Journal of Machine Learning Research.
Efficiently matching sets of features with random histograms. MM '08: Proceedings of the 16th ACM International Conference on Multimedia.
Common vector approach and its combination with GMM for text-independent speaker recognition. Expert Systems with Applications: An International Journal.
In this paper, we present a new kernel for unordered sets of data of the same type. It works by first fitting each set with a Gaussian mixture, then evaluating an efficient kernel on the two fitted mixtures. Furthermore, we show that this kernel can be extended to sets embedded in a feature space implicitly defined by another kernel: the Gaussian mixtures are then fitted with the kernelized EM algorithm [6], and the kernel for Gaussian mixtures is modified to use the outputs of kernelized EM. All computation depends on the data only through their inner products, i.e., evaluations of the base kernel. The kernel is computable in closed form, and the ability to work in a feature space improves its flexibility and applicability. Its performance is evaluated in experiments on both synthetic and real data.
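To illustrate the kind of closed-form kernel between Gaussian mixtures described above, the sketch below implements the expected-likelihood kernel K(p, q) = ∫ p(x) q(x) dx, which reduces to a double sum of pairwise Gaussian overlap integrals. This is a standard construction and an assumption for illustration, not necessarily the exact kernel proposed in the paper; all function names here are hypothetical.

```python
import numpy as np

def gauss_overlap(mu1, cov1, mu2, cov2):
    """Closed-form Gaussian product integral:
    integral of N(x; mu1, cov1) * N(x; mu2, cov2) dx = N(mu1; mu2, cov1 + cov2)."""
    d = mu1.shape[0]
    cov = cov1 + cov2
    diff = mu1 - mu2
    norm = (2 * np.pi) ** (-d / 2) / np.sqrt(np.linalg.det(cov))
    return norm * np.exp(-0.5 * diff @ np.linalg.solve(cov, diff))

def gmm_kernel(weights1, means1, covs1, weights2, means2, covs2):
    """Expected-likelihood kernel between two Gaussian mixtures:
    a weighted double sum of pairwise component overlaps (closed form)."""
    return sum(
        w1 * w2 * gauss_overlap(m1, c1, m2, c2)
        for w1, m1, c1 in zip(weights1, means1, covs1)
        for w2, m2, c2 in zip(weights2, means2, covs2)
    )

if __name__ == "__main__":
    # Two toy mixtures over R^2 with parameters chosen for illustration.
    w_p = [0.5, 0.5]
    mu_p = [np.zeros(2), np.ones(2)]
    cov_p = [np.eye(2), np.eye(2)]
    w_q = [1.0]
    mu_q = [np.full(2, 0.5)]
    cov_q = [2.0 * np.eye(2)]
    print(gmm_kernel(w_p, mu_p, cov_p, w_q, mu_q, cov_q))
```

Because each term is an analytic Gaussian integral, no sampling or numerical quadrature is needed; the cost is quadratic in the number of mixture components, independent of the original set sizes.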