This paper presents Feature Interval Learning (FIL) algorithms, which represent multi-concept descriptions in the form of disjoint feature intervals. The FIL algorithms are batch supervised inductive learning algorithms that use feature projections of the training instances to represent the induced classification knowledge. The concept description is learned separately for each feature, as a set of disjoint intervals. The class of an unseen instance is determined by weighted-majority voting over the feature predictions. The basic FIL algorithm is enhanced with adaptive interval and feature-weight schemes in order to handle noisy and irrelevant features. The algorithms are empirically evaluated on twelve data sets from the UCI repository and compared with the k-NN, k-NNFP, and NBC classification algorithms. The experiments demonstrate that the FIL algorithms are robust to irrelevant features and missing feature values, and achieve accuracy comparable to the best of the existing algorithms with significantly lower average running times.
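The core idea described above can be sketched in a few lines: for each feature, merge the sorted training values into maximal runs of points sharing a class label, so that each run becomes one disjoint interval voting for its class; prediction then tallies (optionally weighted) feature votes. This is a minimal illustrative sketch, not the authors' implementation; the function names `fit_fil` and `predict_fil` are hypothetical, and the sketch omits details such as adaptive interval weights, conflicting duplicate values, and missing-value handling.

```python
from collections import Counter

def fit_fil(X, y):
    """Build per-feature disjoint intervals (simplified FIL sketch).

    For each feature, the training values are sorted and merged into
    maximal runs of points with the same class; each run becomes one
    interval (lower, upper, class).
    """
    models = []
    for f in range(len(X[0])):
        pairs = sorted((x[f], c) for x, c in zip(X, y))
        intervals = []  # list of (lower, upper, class)
        for v, c in pairs:
            if intervals and intervals[-1][2] == c:
                lo, _, _ = intervals[-1]
                intervals[-1] = (lo, v, c)  # extend the current run
            else:
                intervals.append((v, v, c))  # start a new interval
        models.append(intervals)
    return models

def predict_fil(models, x, weights=None):
    """Classify by weighted-majority voting of per-feature predictions.

    A feature abstains if the query value falls outside all of its
    intervals; returns None if every feature abstains.
    """
    weights = weights or [1.0] * len(models)
    votes = Counter()
    for f, intervals in enumerate(models):
        for lo, hi, c in intervals:
            if lo <= x[f] <= hi:
                votes[c] += weights[f]
                break
    return votes.most_common(1)[0][0] if votes else None
```

With uniform feature weights this reduces to plain majority voting; the paper's adaptive weighting would instead down-weight features whose intervals predict poorly, which is what makes the method robust to irrelevant features.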