Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique of dynamic integration, in which local accuracy estimates are calculated for each base classifier of an ensemble in the neighborhood of a new instance to be processed. Generally, the whole space of original features is used to find the neighborhood of a new instance for local accuracy estimation in dynamic integration. However, when dynamic integration takes place in high dimensions, the search for the neighborhood of a new instance is problematic, since most of the space is empty and neighbors may in fact lie far from each other. Furthermore, when noisy or irrelevant features are present, irrelevant neighbors are also likely to be associated with a test instance. In this paper, we propose to use feature extraction to cope with the curse of dimensionality in the dynamic integration of classifiers. We consider classical principal component analysis and two eigenvector-based feature extraction methods that take class information into account. Experimental results show that, on some data sets, the use of FEDIC leads to significantly higher ensemble accuracies than the use of plain dynamic integration in the space of original features.
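The core idea of the abstract can be illustrated with a minimal sketch: train a heterogeneous ensemble, extract features with PCA, and for each test instance select the base classifier with the highest local accuracy among nearest neighbors found in the reduced space. This is a hypothetical simplification, not the authors' implementation — FEDIC also uses class-conditional eigenvector-based extraction and other integration strategies besides selection, and the data set and parameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for a real benchmark set.
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# A heterogeneous ensemble of base classifiers trained on original features.
ensemble = [DecisionTreeClassifier(max_depth=5, random_state=0),
            GaussianNB(),
            LogisticRegression(max_iter=1000)]
for clf in ensemble:
    clf.fit(X_tr, y_tr)

# Feature extraction: neighbors are searched in the PCA space, not in the
# original high-dimensional (and possibly noisy) feature space.
pca = PCA(n_components=5).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)
nn = NearestNeighbors(n_neighbors=15).fit(Z_tr)

# Precompute each base classifier's correctness on the training instances;
# local accuracy is the mean correctness over a test instance's neighbors.
correct = np.array([clf.predict(X_tr) == y_tr for clf in ensemble])

preds = []
for i, z in enumerate(Z_te):
    _, idx = nn.kneighbors(z.reshape(1, -1))
    local_acc = correct[:, idx[0]].mean(axis=1)
    # Dynamic selection: trust the locally most accurate base classifier.
    best = int(np.argmax(local_acc))
    preds.append(ensemble[best].predict(X_te[i:i + 1])[0])

accuracy = float(np.mean(np.array(preds) == y_te))
print(f"ensemble accuracy: {accuracy:.2f}")
```

Replacing the PCA transform with a class-conditional (supervised) extraction method changes only the space in which neighbors are searched; the local-accuracy bookkeeping stays the same.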