Hypergraph based information-theoretic feature selection
Pattern Recognition Letters
In many data analysis tasks, one is confronted with very high dimensional data. Feature selection is essentially a combinatorial optimization problem and is therefore computationally expensive. To make it tractable, it is frequently assumed either that features influence the class variable independently or that they interact only pairwise. To relax this assumption, we propose an algorithm consisting of three phases: i) it first constructs a graph in which each node corresponds to a feature and each edge carries a weight equal to the mutual information (MI) between the two features it connects; ii) it then performs dominant set clustering to select a highly coherent set of features; iii) it finally selects features based on a new measure called multidimensional interaction information (MII). The advantage of MII is that it can capture third- and higher-order feature interactions. Dominant set clustering, by separating the features into clusters in advance, allows us to limit the search space for these higher-order interactions. Experimental results demonstrate the effectiveness of our feature selection method on a number of standard data sets.
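The first two phases above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses a simple histogram-based MI estimate (bin count is an arbitrary choice here) for the edge weights of phase i), and plain replicator dynamics in the style of Pavan and Pelillo's dominant-set framework for phase ii); the MII-based phase iii) is omitted.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based MI estimate (in nats) between two 1-D feature vectors."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                    # joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    nz = pxy > 0                                 # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mi_weight_matrix(X, bins=8):
    """Phase i): graph over features, edge weight = pairwise MI (zero diagonal)."""
    d = X.shape[1]
    W = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            W[i, j] = W[j, i] = mutual_information(X[:, i], X[:, j], bins)
    return W

def dominant_set(W, iters=500, tol=1e-8):
    """Phase ii): replicator dynamics; the support of the fixed point x
    (entries well above zero) indicates a coherent cluster of features."""
    d = W.shape[0]
    x = np.full(d, 1.0 / d)                      # start from the barycenter
    for _ in range(iters):
        x_new = x * (W @ x)
        s = x_new.sum()
        if s == 0:
            break
        x_new /= s
        if np.abs(x_new - x).sum() < tol:
            return x_new
        x = x_new
    return x
```

For example, if two columns of `X` are strongly correlated and a third is independent noise, the MI edge between the first two dominates, and replicator dynamics drives the weight of the third feature toward zero, leaving the coherent pair as the dominant set.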