Choosing an appropriate classification algorithm for a given data set is important and useful in practice, but it is also challenging. This paper proposes a method for recommending classification algorithms. First, the feature vectors of the available data sets are extracted with a novel method, and the performance of candidate classification algorithms on those data sets is evaluated. Then, the feature vector of a new data set is extracted and its k nearest data sets are identified. Finally, the algorithms that perform well on those nearest data sets are recommended for the new data set. The proposed feature extraction method characterizes data sets with both structural and statistical information, which distinguishes it from existing methods. To evaluate both the recommendation method and the feature extraction method, extensive experiments were conducted on 84 publicly available UCI data sets, covering 17 different types of classification algorithms, three different data set characterization methods, and all possible numbers of nearest data sets. The results indicate that the proposed method is effective and usable in practice.
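The recommendation step described above can be sketched as a simple nearest-neighbour lookup in meta-feature space. The sketch below is illustrative only: the function and variable names are invented for this example, the distance metric is plain Euclidean distance, and the tiny meta-feature vectors stand in for whatever structural and statistical features the paper's extraction method would actually produce.

```python
import numpy as np

def recommend_algorithms(new_features, dataset_features, best_algorithms, k=3):
    """Recommend classification algorithms for a new data set by finding
    its k nearest known data sets in meta-feature space and collecting
    the algorithms that performed best on those neighbours.

    new_features     : 1-D array, meta-feature vector of the new data set
    dataset_features : 2-D array, one meta-feature vector per known data set
    best_algorithms  : list, best-performing algorithm name per known data set
    """
    # Euclidean distance from the new data set to every known data set
    dists = np.linalg.norm(dataset_features - new_features, axis=1)
    # Indices of the k nearest data sets
    nearest = np.argsort(dists)[:k]
    # Rank the neighbours' winning algorithms by how often they appear
    votes = {}
    for i in nearest:
        votes[best_algorithms[i]] = votes.get(best_algorithms[i], 0) + 1
    return sorted(votes, key=votes.get, reverse=True)

# Toy example: four known data sets described by two meta-features each
features = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]])
winners = ["C4.5", "C4.5", "NaiveBayes", "kNN"]
print(recommend_algorithms(np.array([0.15, 0.85]), features, winners, k=2))
# With k=2, both nearest data sets favour C4.5, so it is recommended first
```

In a real meta-learning setting the per-data-set performance table would hold accuracy for all candidate algorithms rather than a single winner, and the recommendation could rank algorithms by their average performance over the k neighbours instead of by vote count.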