Current electronic data repositories, especially in medical domains, contain enormous amounts of data. Such data are often heterogeneous across the feature space: different features carry different importance in different sub-areas of the whole space. In this paper, we propose a technique that searches for a strategic splitting of the feature space, identifying the best feature subset for each instance. Our technique is based on the wrapper approach, in which a classification algorithm serves as the evaluation function for comparing candidate feature subsets. We build on the recently developed technique for dynamic integration of classifiers and use decision trees: for each test instance, we consider only those feature combinations that include the features on the path the instance takes through the decision tree. We evaluate the technique on medical datasets from the UCI machine-learning repository. The experiments show that local feature selection is often advantageous in comparison with feature selection over the whole space.
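The per-instance idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn estimators, uses a decision tree's `decision_path` to collect the features tested on a test instance's path, and scores that subset with a wrapper (cross-validated accuracy of a lazy k-NN learner). All function names and dataset choices here are illustrative assumptions.

```python
# Hedged sketch: per-instance feature subsets taken from the decision-tree
# path of a test instance, evaluated with a wrapper criterion.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Decision tree used only to propose local feature subsets.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

def path_features(tree, x):
    """Indices of the features tested on the path x takes through the tree."""
    node_ids = tree.decision_path(x.reshape(1, -1)).indices
    feats = tree.tree_.feature[node_ids]
    return sorted({int(f) for f in feats if f >= 0})  # negative values mark leaves

def wrapper_score(features, X, y):
    """Wrapper evaluation: cross-validated accuracy on the given feature subset."""
    clf = KNeighborsClassifier(n_neighbors=3)  # a lazy learner, as in the paper's setting
    return cross_val_score(clf, X[:, features], y, cv=5).mean()

x = X_test[0]
feats = path_features(tree, x)                              # local subset for this instance
local = wrapper_score(feats, X_train, y_train)
global_ = wrapper_score(list(range(X.shape[1])), X_train, y_train)
print("path features:", feats)
print(f"local subset CV accuracy: {local:.3f}")
print(f"all-features CV accuracy: {global_:.3f}")
```

In a full version of the scheme, one would repeat this for every test instance and search over combinations of the path features rather than taking the whole path at once; the sketch shows only the core mechanism of deriving an instance-specific candidate subset from the tree.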