Feature Extraction for Dynamic Integration of Classifiers
Fundamenta Informaticae
Multidimensional data are often heterogeneous across the feature space, so that individual features have unequal importance in different subareas of that space. This motivates the search for a technique that strategically partitions the instance space and identifies the best subset of features for each instance to be classified. Our technique applies the wrapper approach, in which a classification algorithm serves as the evaluation function that discriminates between candidate feature subsets. To make the feature selection local, we apply a recent technique for the dynamic integration of classifiers, which determines which classifier, and hence which feature subset, should be used for each new instance. Decision trees help restrict the number of feature combinations analyzed: for each new instance we consider only those combinations that include the features appearing on the path the instance takes through a decision tree built on the whole feature set. We evaluate our technique on data sets from the UCI machine learning repository, using the C4.5 algorithm both for the base classifiers and for the decision trees that guide the local feature selection. The experiments show some advantages of local feature selection with dynamic integration of classifiers over the selection of a single feature subset for the whole space.
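For illustration only, the following minimal Python sketch shows the path-guided idea described in the abstract; it is not the authors' implementation. It assumes scikit-learn, uses CART trees as a stand-in for C4.5, trains one single-feature base classifier per feature instead of the paper's richer feature combinations, and substitutes plain training accuracy for the paper's dynamic integration based on local accuracy estimates. All names and parameter choices in the sketch are assumptions.

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Guiding decision tree built on the whole feature set.
    guide_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

    def path_features(tree, x):
        """Indices of the features tested along the path taken by instance x."""
        node_ids = tree.decision_path(x.reshape(1, -1)).indices
        # Negative feature values mark leaf nodes and are skipped.
        return sorted({tree.tree_.feature[n] for n in node_ids
                       if tree.tree_.feature[n] >= 0})

    # One base classifier per single feature (a simplification of the paper's
    # feature combinations), each trained on its own one-feature view of the data.
    base = {f: DecisionTreeClassifier(random_state=0).fit(X_train[:, [f]], y_train)
            for f in range(X_train.shape[1])}

    correct = 0
    for x, target in zip(X_test, y_test):
        feats = path_features(guide_tree, x) or list(range(X_train.shape[1]))
        # Crude stand-in for dynamic integration: among the classifiers whose
        # feature lies on this instance's path, pick the one with the highest
        # training accuracy (the paper estimates local accuracy instead).
        best = max(feats, key=lambda f: base[f].score(X_train[:, [f]], y_train))
        pred = base[best].predict(x[[best]].reshape(1, -1))[0]
        correct += int(pred == target)

    print("path-guided local selection accuracy:", correct / len(y_test))

The only element taken directly from the abstract is the path restriction: each test instance is classified using only features that lie on its path through a tree grown on the full feature set.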