A novel feature selection method and its application
Journal of Intelligent Information Systems
Many problems in pattern classification and knowledge discovery require selecting a subset of attributes or features to represent the patterns to be classified. The approach presented in this paper is designed mainly for multiple classifier systems with homogeneous (identical) classifiers; such systems require many different subsets of the data set. Finding the best subsets of a given feature set is a problem of exponential complexity. The main aim of this paper is to improve the RBFS feature selection algorithm, which is computationally expensive because it uses all decision-relative reducts of a given decision table. To speed it up, we propose a new algorithm, called ARS, whose task is to reduce the number of decision-relative reducts of a decision table. Experiments have shown that ARS greatly improves the execution time of the RBFS algorithm, at the cost of a small loss in the classification accuracy of the multiple classifier built on the resulting feature subsets. To improve classification accuracy, a simplified version of the bagging algorithm has been applied. The algorithms have been tested on several benchmark data sets.
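The paper's RBFS/ARS reduct computations are not given in the abstract, so the following is only an illustrative sketch of the surrounding idea: a homogeneous multiple classifier system in which each base classifier (here a 1-nearest-neighbour rule, an assumed stand-in) is trained on a bootstrap sample (simplified bagging) restricted to one feature subset, with the subsets playing the role of decision-relative reducts, and predictions combined by majority vote. All names and data below are hypothetical.

```python
import random
from collections import Counter

def nn_predict(train, labels, features, x):
    """1-nearest-neighbour prediction restricted to the given feature subset."""
    def dist(a, b):
        return sum((a[f] - b[f]) ** 2 for f in features)
    best = min(range(len(train)), key=lambda i: dist(train[i], x))
    return labels[best]

def bagged_ensemble_predict(train, labels, feature_subsets, x, seed=0):
    """Homogeneous ensemble: one base classifier per feature subset,
    each trained on a bootstrap sample; majority vote over predictions."""
    rng = random.Random(seed)
    n = len(train)
    votes = []
    for features in feature_subsets:
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        sample = [train[i] for i in idx]
        sample_labels = [labels[i] for i in idx]
        votes.append(nn_predict(sample, sample_labels, features, x))
    return Counter(votes).most_common(1)[0][0]

# Toy data (hypothetical): both features separate the two classes.
X = [(0.0, -5.0), (0.1, -4.0), (1.0, 4.0), (0.9, 5.0)]
y = ["a", "a", "b", "b"]
subsets = [(0,), (0, 1), (1,)]  # stand-ins for decision-relative reducts
print(bagged_ensemble_predict(X, y, subsets, (0.05, -4.5)))
```

In the actual method, the feature subsets would come from the reducts retained by ARS rather than being hand-picked, and the base classifiers would be whatever homogeneous learner the system uses; the combination step, however, is the same majority vote shown here.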