We present and study the contribution-selection algorithm (CSA), a novel algorithm for feature selection. The algorithm is based on Multi-perturbation Shapley Analysis (MSA), a framework that relies on game theory to estimate the usefulness of features. The algorithm iteratively estimates feature contributions and selects features accordingly, using either forward selection or backward elimination. It can optimize various performance measures over unseen data, such as accuracy, balanced error rate, and area under the receiver operating characteristic (ROC) curve. An empirical comparison with several existing feature selection methods shows that the backward elimination variant of CSA yields the most accurate classification results on an array of data sets.
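The core idea described above can be sketched in a few lines: estimate each feature's Shapley value by sampling permutations of the feature set and averaging marginal gains, then eliminate the weakest feature and repeat. The following is a minimal illustrative sketch, not the authors' implementation; in particular, `value_fn` here is a hypothetical stand-in for a classifier's held-out performance (e.g., cross-validated accuracy), and the sample size `n_perm` is an arbitrary choice.

```python
import random

def shapley_contributions(features, value_fn, n_perm=200, rng=None):
    """Estimate each feature's Shapley value by sampling permutations of the
    feature set and averaging marginal contributions (the sampling idea behind
    multi-perturbation Shapley analysis)."""
    rng = rng or random.Random(0)
    contrib = {f: 0.0 for f in features}
    for _ in range(n_perm):
        order = list(features)
        rng.shuffle(order)
        subset, prev = set(), value_fn(set())
        for f in order:
            subset.add(f)           # add the next feature in the permutation
            cur = value_fn(subset)
            contrib[f] += cur - prev  # marginal gain credited to feature f
            prev = cur
    return {f: c / n_perm for f, c in contrib.items()}

def backward_elimination(features, value_fn, keep):
    """CSA-style backward elimination: repeatedly drop the feature with the
    smallest estimated contribution until `keep` features remain."""
    selected = set(features)
    while len(selected) > keep:
        phi = shapley_contributions(selected, value_fn)
        worst = min(phi, key=phi.get)
        selected.discard(worst)
    return selected

# Toy additive value function (assumed for illustration only): each feature
# contributes a fixed amount, so the uninformative feature 'c' is dropped first.
weights = {'a': 0.3, 'b': 0.2, 'c': 0.0}
value_fn = lambda S: sum(weights[f] for f in S)
print(backward_elimination(['a', 'b', 'c'], value_fn, keep=2))
```

For this additive toy game the sampled Shapley values coincide with the fixed weights, so the zero-weight feature is eliminated; with a real classifier, contributions interact and must be re-estimated after each elimination step, which is what the iterative loop above does.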