Instance selection is often used with lazy classifiers. This paper addresses the need for instance selection with neural network and decision tree classifiers and presents a novel Supervised Instance Selection (SIS) algorithm. Initially, a neural network classifier is constructed using all training instances. The algorithm then selects a small subset of instances, using the certainty values produced by the wrapped neural network, to build a compact classifier. An empirical study on standard datasets shows that SIS saves about 70% of storage space without degrading accuracy, and that it is independent of the nature of the dataset and of the tool used.
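The abstract does not spell out the paper's exact selection criterion, but the general idea (train a classifier on all instances, score each instance by the classifier's certainty, and keep only a subset) can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: the certainty measure is assumed to be the maximum predicted class probability, and the (hypothetical) rule keeps only low-certainty instances near the decision boundary.

```python
import numpy as np

def select_instances(class_probs, threshold=0.9):
    """Return indices of instances whose certainty falls below `threshold`.

    class_probs: (n_instances, n_classes) array of predicted class
                 probabilities, as produced by the wrapped classifier.
    Note: the threshold and the "keep uncertain instances" rule are
    assumptions for illustration, not taken from the paper.
    """
    certainty = class_probs.max(axis=1)        # certainty score per instance
    return np.where(certainty < threshold)[0]  # indices of retained instances

# Toy example: 5 instances, 2 classes.
probs = np.array([
    [0.99, 0.01],   # very certain  -> dropped
    [0.55, 0.45],   # near boundary -> kept
    [0.05, 0.95],   # very certain  -> dropped
    [0.30, 0.70],   # uncertain     -> kept
    [0.92, 0.08],   # very certain  -> dropped
])
kept = select_instances(probs)  # -> indices [1, 3], i.e. 60% storage saved
```

A smaller retained set means the final compact classifier is trained on a fraction of the original storage, which is the effect the abstract reports (around 70% space saved on standard datasets).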