Supervised learning models are most commonly trained with crisp labels. Crisp labels, however, fail to capture the characteristics of the data when classes overlap. In this work we compare learning with soft labels against learning with hard (crisp) labels for training K-nearest neighbor classifiers. We propose a new technique for generating soft labels based on fuzzy clustering of the data and fuzzy relabeling of the cluster prototypes. Experiments were conducted on five data sets to compare classifiers trained with different types of soft labels against classifiers trained with crisp labels. The results reveal that learning with soft labels is more robust to label errors than learning with crisp labels. The proposed technique for deriving soft labels from the data was also found to yield more robust training on most of the data sets investigated.
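The soft-label generation described above (cluster the data with a fuzzy method, relabel each cluster prototype by the class memberships of its points, then propagate those prototype labels back to the points) can be sketched roughly as follows. This is an illustrative sketch only: the use of fuzzy c-means, the fuzzifier m=2, and the function names are assumptions on our part, not the paper's exact formulation.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns cluster centers (c, d) and a
    membership matrix U (n, c) whose rows sum to 1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # membership-weighted cluster prototypes
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances of every point to every prototype
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # standard FCM membership update
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

def soft_labels(U, crisp_labels, n_classes):
    """Fuzzy relabeling of prototypes: each cluster gets a class
    distribution from the membership-weighted class frequencies of its
    points; each point's soft label is its membership-weighted mix of
    the prototype distributions."""
    Y = np.eye(n_classes)[crisp_labels]      # (n, k) one-hot crisp labels
    P = U.T @ Y                              # (c, k) cluster-class mass
    P /= P.sum(axis=1, keepdims=True)        # prototype class distributions
    S = U @ P                                # (n, k) soft labels per point
    return S / S.sum(axis=1, keepdims=True)
```

The soft labels produced this way could then replace the one-hot targets when training a (fuzzy) K-nearest neighbor classifier; points in overlap regions receive graded memberships rather than a single hard class.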