Label semantics is a random-set-based framework for "Computing with Words" that captures the idea of computing with linguistic terms rather than numerical quantities. Within this framework, a decision tree learning model is proposed in which nodes are linguistic descriptions of variables and leaves are sets of appropriate labels. In such decision trees, the probability estimates for branches across the whole tree are used for classification, instead of the majority class of the single branch into which an example falls. Empirical experiments on real-world datasets verify that the algorithm achieves better or equivalent classification accuracy compared to three well-known machine learning algorithms. By applying a new forward branch-merging algorithm, the complexity of the tree can be greatly reduced without significant loss of accuracy. Finally, a linguistic interpretation of the trees and classification under linguistic constraints are introduced.
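The whole-tree classification idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the fuzzy labels, their trapezoidal membership functions, the branch structure, and the leaf probabilities below are all hypothetical. The point it demonstrates is that every branch contributes to the class estimate, weighted by the degree to which the example satisfies the branch's linguistic description, rather than routing the example down a single branch and taking its majority class.

```python
def trapezoid(x, a, b, c, d):
    """Membership degree of x in a trapezoidal fuzzy label (illustrative)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical fuzzy labels on a single attribute.
LABELS = {
    "small":  lambda x: trapezoid(x, -1.0, 0.0, 0.3, 0.5),
    "medium": lambda x: trapezoid(x,  0.3, 0.5, 0.7, 0.9),
    "large":  lambda x: trapezoid(x,  0.7, 0.9, 1.0, 2.0),
}

# Each branch: (linguistic description of the attribute,
#              class probabilities stored at the leaf).
BRANCHES = [
    ("small",  {"pos": 0.1, "neg": 0.9}),
    ("medium", {"pos": 0.6, "neg": 0.4}),
    ("large",  {"pos": 0.9, "neg": 0.1}),
]

def classify(x):
    """Aggregate class probabilities over ALL branches, each weighted by
    how well x matches that branch's labels, then renormalise."""
    scores = {"pos": 0.0, "neg": 0.0}
    for label, leaf in BRANCHES:
        w = LABELS[label](x)          # degree to which x satisfies the branch
        for cls, p in leaf.items():
            scores[cls] += w * p
    total = sum(scores.values())
    return {cls: s / total for cls, s in scores.items()} if total else scores

print(classify(0.8))  # → {'pos': 0.75, 'neg': 0.25}
```

For x = 0.8 the example partially satisfies both "medium" and "large" (degree 0.5 each), so both leaves contribute to the final probability estimate; a crisp tree would have committed to a single branch.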