C4.5: Programs for Machine Learning
Advances in the Dempster-Shafer theory of evidence. Information Sciences: An International Journal
An Introduction to Support Vector Machines: And Other Kernel-Based Learning Methods
The algorithm on knowledge reduction in incomplete information systems. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems
Machine Learning
Rough sets and intelligent data analysis. Information Sciences—Informatics and Computer Science: An International Journal
A Rough Set-Aided System for Sorting WWW Bookmarks. WI '01 Proceedings of the First Asia-Pacific Conference on Web Intelligence: Research and Development
Reduction and axiomization of covering generalized rough sets. Information Sciences: An International Journal
Approaches to knowledge reduction based on variable precision rough set model. Information Sciences—Informatics and Computer Science: An International Journal - Mining stream data
Mining diagnostic rules from clinical databases using rough sets and medical diagnostic model. Information Sciences: An International Journal - Special issue: Medical expert systems
Semantics-Preserving Dimensionality Reduction: Rough and Fuzzy-Rough-Based Approaches. IEEE Transactions on Knowledge and Data Engineering
Information-preserving hybrid data reduction based on fuzzy-rough techniques. Pattern Recognition Letters
Information Sciences: An International Journal
A weighted rough set based method developed for class imbalance learning. Information Sciences: An International Journal
Attribute reduction in decision-theoretic rough set models. Information Sciences: An International Journal
RSEISP '07 Proceedings of the International Conference on Rough Sets and Intelligent Systems Paradigms
Information Sciences: An International Journal
Relative reducts in consistent and inconsistent decision tables of the Pawlak rough set model. Information Sciences: An International Journal
Knowledge reduction in random information systems via Dempster-Shafer theory of evidence. Information Sciences: An International Journal
Positive approximation: An accelerator for attribute reduction in rough set theory. Artificial Intelligence
Approximation reduction in inconsistent incomplete decision tables. Knowledge-Based Systems
Rule learning for classification based on neighborhood covering reduction. Information Sciences: An International Journal
An effective discretization based on Class-Attribute Coherence Maximization. Pattern Recognition Letters
Knowledge reduction in real decision formal contexts. Information Sciences: An International Journal
Extended rough set-based attribute reduction in inconsistent incomplete decision systems. Information Sciences: An International Journal
Test-cost-sensitive attribute reduction. Information Sciences: An International Journal
Fundamenta Informaticae
Normalized Decision Functions and Measures for Inconsistent Decision Tables Analysis. Fundamenta Informaticae
A novel and better fitness evaluation for rough set based minimum attribute reduction problem. Information Sciences: An International Journal
This paper focuses on three types of attribute reducts in inconsistent decision tables: the assignment reduct, the distribution reduct, and the maximum distribution reduct. Judging these reducts directly from their definitions is quite inconvenient, so we propose judgment theorems for the assignment reduct, the distribution reduct, and the maximum distribution reduct, which greatly simplify checking whether a candidate attribute subset is a reduct of each type. On this basis, we derive three new attribute significance measures and construct three corresponding algorithms: Q-ARA (Quick Assignment Reduction Algorithm), Q-DRA (Quick Distribution Reduction Algorithm), and Q-MDRA (Quick Maximum Distribution Reduction Algorithm). We conduct a series of comparative experiments on twelve data sets from the UCI machine learning repository (University of California at Irvine), including both consistent and inconsistent decision tables, to evaluate the three proposed reduction algorithms against the related algorithm QuickReduct [9,34]. The experimental results show that QuickReduct lacks robustness: it fails to find a reduct even for some consistent data sets, whereas our three algorithms find a reduct for every data set. In addition, we compare Q-DRA with CEBARKNC (conditional entropy-based algorithm for reduction of knowledge without a computing core) [43], since both find the distribution reduct by heuristic search. The experimental results demonstrate that Q-DRA runs faster than CEBARKNC because its distribution function is cheaper to compute. We also draw instructive conclusions about these reduction algorithms from the perspective of classification performance with the C4.5 and RBF-SVM classifiers.
Last, we compare our algorithms with discernibility matrix-based methods; the experimental results indicate that our algorithms are efficient and feasible.
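For context on the comparison above, the following is a minimal Python sketch of the greedy, forward-selection search that QuickReduct-style heuristic reduction algorithms perform. The data layout, function names, and the use of the standard positive-region dependency degree as the quality measure are illustrative assumptions for this sketch, not the paper's own code or its assignment/distribution functions.

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into equivalence classes induced by the given attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def dependency(rows, labels, attrs):
    """Dependency degree gamma: the fraction of rows whose equivalence class
    is consistent on the decision label (the rough-set positive region)."""
    if not attrs:
        return 0.0
    pos = sum(len(block) for block in partition(rows, attrs)
              if len({labels[i] for i in block}) == 1)
    return pos / len(rows)

def quickreduct(rows, labels):
    """Greedy QuickReduct-style search: repeatedly add the attribute that most
    increases the dependency degree until it matches that of all attributes."""
    all_attrs = list(range(len(rows[0])))
    target = dependency(rows, labels, all_attrs)
    reduct, gamma = [], 0.0
    while gamma < target:
        best, best_gamma = None, gamma
        for a in all_attrs:
            if a in reduct:
                continue
            g = dependency(rows, labels, reduct + [a])
            if g > best_gamma:
                best, best_gamma = a, g
        if best is None:  # no attribute improves gamma: the greedy search stalls
            break
        reduct.append(best)
        gamma = best_gamma
    return reduct
```

The abstract's robustness observation fits this structure: a greedy search of this kind can stall or return a superfluous attribute set, whereas the paper's judgment theorems let its three algorithms verify reduct conditions directly.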