Measuring similarity between objects is a fundamental issue for numerous applications in the data-mining and machine-learning domains. In this paper, we are interested in kernels. We particularly focus on kernel normalization methods that aim at designing proximity measures that better fit the definition and the intuition of a similarity index. To this end, we introduce a new family of normalization techniques that extends the cosine normalization. Our approach refines the cosine measure between vectors in the feature space by considering another geometry-based score, namely the norm ratio of the mapped vectors. We show that the designed normalized kernels satisfy the basic axioms of a similarity index, unlike most unnormalized kernels. Furthermore, we prove that the proposed normalized kernels are themselves kernels. Finally, we assess these different similarity measures in the context of clustering tasks, using a kernel-PCA-based clustering approach. Our experiments on several real-world datasets show the potential benefits of the normalized kernels over the cosine normalization and the Gaussian RBF kernel.
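The cosine normalization that the proposed family extends is a standard construction: each kernel value is divided by the norms of the mapped vectors, so that normalized values are cosines in the feature space. The sketch below illustrates only this baseline step in NumPy (the paper's norm-ratio refinement is not reproduced here; the function name and the toy linear-kernel data are illustrative choices, not from the paper):

```python
import numpy as np

def cosine_normalize(K):
    """Cosine-normalize a kernel (Gram) matrix K.

    K_norm[i, j] = K[i, j] / sqrt(K[i, i] * K[j, j]),
    i.e. the cosine of the angle between the mapped vectors
    phi(x_i) and phi(x_j) in the feature space. Diagonal
    entries become 1, and all values lie in [-1, 1].
    """
    d = np.sqrt(np.diag(K))  # norms of the mapped vectors
    return K / np.outer(d, d)

# Toy example: linear kernel on three 2-D points.
X = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
K = X @ X.T               # Gram matrix of the linear kernel
K_cos = cosine_normalize(K)
```

Because the result is itself a valid kernel with unit self-similarity, it already satisfies one basic axiom of a similarity index (maximal self-similarity); the paper's contribution is to refine this measure further with the ratio of the mapped vectors' norms.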