Meta-classifiers and selective superiority
IEA/AIE '00 Proceedings of the 13th international conference on Industrial and engineering applications of artificial intelligence and expert systems: Intelligent problem solving: methodologies and approaches
Empirical comparisons of existing learning algorithms show that each algorithm has a selective superiority: each is best for some, but not all, tasks. Given a data set, it is often unclear beforehand which algorithm will yield the best performance. In this article we present an approach that uses characteristics of the given data set, in the form of feedback from the learning process, to guide a search for a tree-structured hybrid classifier. Heuristic knowledge about which characteristics indicate that one bias is better than another is encoded in the rule base of the Model Class Selection (MCS) system. The approach does not assume that the entire instance space is best learned using a single representation language; for some data sets, forming a hybrid classifier is a better bias, and MCS is able to detect these cases. An empirical evaluation shows that MCS achieves classification accuracy equal to or higher than the best of its primitive learning components on each data set, demonstrating that the heuristic rules effectively select an appropriate learning bias.
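To make the idea concrete, the sketch below shows what rule-based model-class selection from data-set characteristics can look like. The characteristics, thresholds, and model-class names here are illustrative assumptions for exposition only; they are not the actual MCS rule base, which encodes feedback from the learning process itself.

```python
# Hypothetical sketch of heuristic model-class selection in the spirit of MCS.
# The characteristics, thresholds, and model-class names are illustrative
# assumptions, NOT the actual MCS rules.

def dataset_characteristics(X, y):
    """Compute simple characteristics of a dataset.

    X: list of feature vectors; y: list of class labels.
    A feature is treated as nominal if any of its values is a string.
    """
    n_instances = len(X)
    n_features = len(X[0]) if X else 0
    n_nominal = sum(
        1 for j in range(n_features)
        if any(isinstance(row[j], str) for row in X)
    )
    n_classes = len(set(y))
    return {
        "instances_per_class": n_instances / max(n_classes, 1),
        "fraction_nominal": n_nominal / max(n_features, 1),
    }

def select_model_class(X, y):
    """Map data-set characteristics to a learning bias via heuristic rules."""
    c = dataset_characteristics(X, y)
    if c["fraction_nominal"] > 0.5:
        return "decision_tree"       # symbolic bias suits mostly nominal features
    if c["instances_per_class"] < 20:
        return "nearest_neighbour"   # instance-based bias for sparsely populated classes
    return "linear_discriminant"     # numeric features, well-populated classes

if __name__ == "__main__":
    X = [["red"], ["blue"], ["red"]]
    y = ["a", "b", "a"]
    print(select_model_class(X, y))  # → decision_tree
```

In MCS such rules are applied recursively: if no single bias fits the whole instance space, the space is partitioned and the selection is repeated per partition, yielding the tree-structured hybrid classifier described above.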