In several application domains, such as biology, computer vision, social network analysis, and information retrieval, multi-class classification problems arise in which data instances do not simply belong to one particular class, but exhibit partial membership in several classes. Existing machine learning and fuzzy set approaches for representing this type of fuzzy information focus mainly on unsupervised methods. In contrast, this article presents supervised learning algorithms for classification problems with partial class memberships, where class memberships rather than crisp class labels serve as input for fitting a model to the data. Using kernel logistic regression (KLR) as a baseline method, we first propose a basic one-versus-all approach in which the binary-coded label vectors in the likelihood are replaced with [0,1]-valued class memberships. Subsequently, we use this KLR extension as a base classifier to construct one-versus-one decompositions, in which partial class memberships are transformed and estimated in a pairwise manner. Empirical results on synthetic data and a real-world bioinformatics application confirm that our approach delivers promising results: the one-versus-all method yields the best computational efficiency, while the one-versus-one methods are preferred in terms of predictive performance, especially when the observed class memberships are heavily unbalanced.
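To make the one-versus-all idea concrete: ordinary kernel logistic regression maximizes a Bernoulli likelihood with 0/1 labels, and the extension described above simply substitutes [0,1]-valued memberships as targets in the cross-entropy. The following is a minimal sketch of this scheme, not the authors' implementation; the function names, the RBF kernel choice, and the plain gradient-descent fitting routine are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr_soft(K, m, lam=1e-2, lr=0.1, iters=500):
    """Fit KLR coefficients with soft targets m in [0,1].

    Minimizes the cross-entropy between sigmoid(K @ alpha) and the
    membership vector m, plus an L2 penalty, via gradient descent.
    With m in {0,1} this reduces to standard binary KLR.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-K @ alpha))
        grad = K @ (p - m) / n + lam * (K @ alpha)
        alpha -= lr * grad
    return alpha

def one_vs_all_soft(X, M, gamma=1.0):
    # One soft-label KLR model per class (column of membership matrix M)
    K = rbf_kernel(X, X, gamma)
    return [fit_klr_soft(K, M[:, c]) for c in range(M.shape[1])]

def predict_memberships(models, X_train, X_new, gamma=1.0):
    # Per-class probabilities, renormalized to sum to one per instance
    Kn = rbf_kernel(X_new, X_train, gamma)
    P = np.column_stack([1.0 / (1.0 + np.exp(-Kn @ a)) for a in models])
    return P / P.sum(axis=1, keepdims=True)
```

Because each class is fit independently against all others, only as many models as classes are needed, which is consistent with the computational-efficiency advantage of the one-versus-all method noted above; the pairwise one-versus-one construction instead trains a model per class pair.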