The paper is concerned with marketing applications of classification analysis. Feature selection (FS) is crucial in this domain to avoid overwhelming decision makers with excessively large attribute sets. Although algorithms for feature ranking have received considerable attention in the literature, a clear strategy for selecting a subset of attributes once a ranking has been obtained is still missing. Consequently, three candidate FS procedures are presented and contrasted by means of empirical experiments on real-world data. The results offer guidance on which approach should be employed in practical applications and identify promising avenues for future research.
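The two-stage setting the abstract describes — first rank attributes, then decide how many top-ranked attributes to keep — can be illustrated with a minimal sketch. This is not the paper's method: the univariate ranking score, the nearest-class-mean classifier, and the prefix-search selection rule below are all simplifying assumptions chosen to keep the example self-contained.

```python
# Illustrative sketch only (not the procedures evaluated in the paper):
# rank features by a simple univariate score, then choose how many
# top-ranked features to keep by evaluating each prefix of the ranking
# on a holdout set. All function names and the toy classifier are
# hypothetical stand-ins for the ranker/classifier a practitioner would use.

def rank_features(X, y):
    """Rank feature indices by |difference of class means| (binary y in {0, 1})."""
    n_features = len(X[0])
    scores = []
    for j in range(n_features):
        pos = [row[j] for row, label in zip(X, y) if label == 1]
        neg = [row[j] for row, label in zip(X, y) if label == 0]
        scores.append(abs(sum(pos) / len(pos) - sum(neg) / len(neg)))
    return sorted(range(n_features), key=lambda j: scores[j], reverse=True)

def nearest_mean_accuracy(X_tr, y_tr, X_te, y_te, feats):
    """Accuracy of a nearest-class-mean classifier restricted to `feats`."""
    def class_mean(label):
        rows = [[r[j] for j in feats] for r, l in zip(X_tr, y_tr) if l == label]
        return [sum(col) / len(col) for col in zip(*rows)]
    m0, m1 = class_mean(0), class_mean(1)
    correct = 0
    for row, label in zip(X_te, y_te):
        v = [row[j] for j in feats]
        d0 = sum((a - b) ** 2 for a, b in zip(v, m0))
        d1 = sum((a - b) ** 2 for a, b in zip(v, m1))
        correct += int((1 if d1 < d0 else 0) == label)
    return correct / len(y_te)

def select_prefix(X_tr, y_tr, X_val, y_val):
    """Return the ranking prefix with the best validation accuracy."""
    ranking = rank_features(X_tr, y_tr)
    best_k, best_acc = 1, -1.0
    for k in range(1, len(ranking) + 1):
        acc = nearest_mean_accuracy(X_tr, y_tr, X_val, y_val, ranking[:k])
        if acc > best_acc:
            best_k, best_acc = k, acc
    return ranking[:best_k], best_acc
```

The point of the sketch is the open question the abstract raises: the ranking alone does not determine the cutoff, so some explicit selection rule (here, a greedy prefix search on holdout accuracy) is still required on top of it.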