Feature selection plays an important role in classification algorithms and is particularly useful for dimensionality reduction, selecting the features with the highest discriminative power. This paper introduces a new feature-selection method, Feature Interaction Maximisation (FIM), which employs three-way interaction information as a measure of feature redundancy. FIM uses a forward greedy search to select features that both have maximum interaction information with the features already selected and provide maximum relevance to the class. Experiments on three datasets from the UCI repository compare the proposed method with four well-known feature-selection methods: Information Gain (IG), Minimum Redundancy Maximum Relevance (mRMR), Double Input Symmetrical Relevance (DISR), and Interaction Gain Based Feature Selection (IGFS). The average classification accuracy of two classifiers, Naive Bayes and K-nearest neighbour, is used to assess each method's performance. The results show that FIM outperforms the other methods.
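The forward greedy search described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact FIM scoring function is not given in the abstract, so the sketch assumes each candidate feature is scored by its relevance I(f; y) plus its summed three-way interaction information I(f; s; y) with the already-selected features, using McGill's sign convention for interaction information (positive = synergy). All entropies are estimated from discrete data with plug-in counts.

```python
import numpy as np

def entropy(*cols):
    # Joint Shannon entropy (in nats) of one or more discrete columns,
    # estimated from empirical counts.
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_info(x, y):
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(x) + entropy(y) - entropy(x, y)

def interaction_info(x, y, c):
    # Three-way interaction information I(X; Y; C), McGill's convention:
    # I(X; Y | C) - I(X; Y); positive when X and Y are synergistic about C.
    return (entropy(x, y) + entropy(x, c) + entropy(y, c)
            - entropy(x) - entropy(y) - entropy(c) - entropy(x, y, c))

def fim_select(X, y, k):
    # Forward greedy search: at each step pick the candidate feature with
    # the highest relevance I(f; y) plus summed interaction information
    # with the features chosen so far (hypothetical scoring; see lead-in).
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        def score(f):
            relevance = mutual_info(X[:, f], y)
            synergy = sum(interaction_info(X[:, f], X[:, s], y)
                          for s in selected)
            return relevance + synergy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

On an XOR-style toy problem (y = x0 ^ x1), `interaction_info(x0, x1, y)` is positive because neither parity bit is informative alone, which is exactly the kind of complementarity a purely pairwise redundancy measure such as mRMR's cannot capture.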