An ensemble-clustering-based distance metric and its applications
International Journal of Business Intelligence and Data Mining
The performance of the k Nearest Neighbor (kNN) algorithm depends critically on being given a good metric over the input space. One of its main drawbacks is that kNN measures similarity and dissimilarity between objects using only geometric distance, without exploiting statistical regularities in the data that could help convey the inter-class distance. We found that objects belonging to the same cluster usually share common traits even when the geometric distance between them is large. We therefore define a metric based on clustering. Since no single clustering algorithm with a single parameter setting is optimal, several clustering runs are performed, yielding an ensemble of clustering (EC) results. The distance between two objects is then defined as the number of clustering runs in which they were not assigned to the same cluster. This distance is used within the framework of the kNN algorithm (kNN-EC). Moreover, objects that were clustered together in every run are defined as members of an equivalence class, so the algorithm operates on equivalence classes instead of single objects. In our experiments the number of equivalence classes is usually one tenth to one fourth of the number of objects. This equivalence class representation is in effect a smart data reduction technique with a wide range of potential applications, complementary to other data reduction methods such as feature selection and dimensionality reduction methods such as PCA. We compared kNN-EC to the original kNN on standard datasets from different fields, and on segmenting a real color image into foreground and background. Our experiments show that kNN-EC performs better than or comparably to the original kNN on the standard datasets and is superior for the color image segmentation.
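The core of the approach described above can be sketched in a few lines. The sketch below is illustrative only, not the authors' implementation: it assumes the ensemble of clustering runs has already been computed and is given as a label matrix with one row per object and one column per run (in practice the columns would come from, e.g., several k-means runs with different parameters). The function names, the toy label matrix, and the choice of k are all hypothetical.

```python
import numpy as np
from collections import Counter

# Hypothetical toy ensemble: rows are objects, columns are clustering runs,
# entries are cluster ids assigned in that run.
L_train = np.array([
    [0, 0, 1],   # objects 0 and 1 fall in the same cluster in every run ...
    [0, 0, 1],   # ... so they form a single equivalence class
    [1, 1, 0],
    [1, 2, 0],
])
y_train = np.array([0, 0, 1, 1])  # class labels of the training objects

def ec_distance(a, b):
    """EC distance: number of runs in which two objects were NOT clustered together."""
    return int(np.sum(a != b))

def equivalence_classes(L, y):
    """Collapse objects with identical label vectors into one representative each."""
    reps, idx = np.unique(L, axis=0, return_index=True)
    return reps, y[idx]

def knn_ec_predict(L_reps, y_reps, query, k=1):
    """Majority vote among the k representatives nearest to `query` in EC distance."""
    d = np.sum(L_reps != query, axis=1)        # EC distance to each representative
    nn = np.argsort(d, kind="stable")[:k]      # indices of the k nearest
    return Counter(y_reps[nn]).most_common(1)[0][0]

reps, y_reps = equivalence_classes(L_train, y_train)
print(len(reps))                                           # 3 classes from 4 objects
print(knn_ec_predict(reps, y_reps, np.array([1, 1, 0])))   # prints 1
```

Note that running kNN over equivalence-class representatives rather than all objects is exactly where the data reduction reported in the paper comes from: identical label vectors carry identical EC distances to every query, so one representative per class suffices.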